[Binary archive content omitted. This file is a tar archive of Zuul CI job output containing:
  var/home/core/zuul-output/
  var/home/core/zuul-output/logs/
  var/home/core/zuul-output/logs/kubelet.log.gz
The kubelet.log.gz member is gzip-compressed binary data and is not recoverable as text in this form; decompress the archive to read the kubelet log.]
.N( ɳz54/бscUSSj?CN幧J eZXyi #=`eʺ颢,m)V6;= ~7|e뙃&Ñ"`8~!*Xl0M -Z9F`Fg]1tEP8~DR͖u9809FG4~4x5ikE!*I@iF'K6Lb꼯lɩ`jV儝v 3$4 5i@F3?V9;I~Hk;Zr-IRˠ(j!mY> ~yC sFؚU5yx࢙?CMԦ%&2KigH޺8z%GQM'7 r,Ac@S(gH)ђhx1;N+{@g #;p!xB2n&+0iL6?Gq8x)g<M^EU5';1͑0|0_,BIHsqK63s*OK^Lk8,IAIaI1hmWOK>hf4xqZ՜=kɘRs$U)!axGs{2uc^ @e;\' #C՞U \՘`P5UQ21ϑ0a'PZ &r!ZcjQs < M!ax ;|t9MI@&vWL4ϑ0<]̤Z7p kR9FwY_JM*heu1F9|`Bq˚ S#,R| #axYH ނE'zVZ%CZYpPs%)3$] Al/IahdG10p}vLL8kJFwR"_;D #RTvMM&'Ab2*e*yS{s$ov: _HXj29@>UϹ. #3վš j Pr)]5;X]M$Α0<7zxW5ig ʖja"#ax'm[sΰM/J[9r 9s$ aTǴL#%LͰ:T:K*2es$m曃bEI"gWM`966?GqkV1()+sq2Ře C[Q>n(J*skB G&9F\N|TSc$AܧuHdƓMsx"5x"!OO4S0" 'S9ˋ_MI mہ?{br.f^-_-7J/zW[߯?r`ƹJssZsVD@G8W1: weI]Is-ғwWVƣٽ +ip㮸ACqWZO]I+>]yU9+ՊRyɿ/߽zfuqt>7~Ao𸥝!+w樛C[l%a_`{s6EʎirSmz\gvO.ޜr883 vR$g-S,DZi5RuњܐM{FASl}4: u\SmYfo,M epa͙6JVaf]ɼVWJ=Kxk5o[nrzr:ef-yF!2+/L38f lBʵSՅ#Saa-_mhcߐZ;Nv!B΃MC55޶P qINH6vφҎvǴG0伻emosܸɨ $or7aЧn&ldVm|];rߋKtMi_1Fv]_]&7H'׻z3sȻtP{dGF JՋK_VsRx0S2as*lwvEg\;ra>g^^s&$H*$ +ZJԪDwcGpCbt, t?rTGn>4ݷ";X2mḞiɯ9nxn'˲97lqr4}m'';yufOvlgz|gI_CE4hQ[4?K h.+~mE-NQkVӃV_=#r {?CH f8ѫ̀Xs'߾YĮ;'R11)yt>c=nc%~N[>#>  Mr !k6krn"}!BfCSg~LVy=hnޱ wY6|m}?a[+9/pr/N7xI//ZoXH14fV57˅*\/fpR )ZȲl3#|$hK.gnSظZTjw@)[5hΡ  -\[hѸ&:дlP+Z*8ڻjcV+^Fd~_[KԨ1l%𽥦 h8=l)eѦ[%K uI\=~46+!B{ aRc)>8pTM,s}C^ΆQ2JU}^-7ZvEG)G I^ɾcdmdHܫX*1!zKrGw.hWmELͱbmlӽ!3Q#ccĞ9j܁'Oinnvg'8HirfުVU&m͘T+Q=M"6NjQZ\P9ՕBE]QI1,;cV1+fE4cL,1<X`c :Mܓ|mp<6+%X!u)dh+PBC6*; 梃K\$[uΊ֜ t[ B\Y{ݷ&auY^-M\w&SK5auL+w97$W_E/{CSDká/>%ʝr%Or?{WXd%QO,s_觮F~`\l^$eVfOP%W2Eǥs Yb DYUfRQ03řʹ(2Xe཰@%DC:Y5Q C::=X]ķ8C?s>F#I.9iA(rccJQ ñ9˄8:#@s;4iN~DC dȸL > F39cR>TlFzL*>LZɫLz5 d2=vp=?l:r­56_%ߧJЯjqh&=E zlьRkw,s,c'aQ A`Lm8NgܳVQ}q4 lOǛ!{18+ynЛm77J4ɷY)ݡsR76]g_Kuu=1P3Uݖ#se,0Ҹ뉸+3ƺC N=\:e*0PkuPDR刅Jfp;Qj̴1LϔI!iɻ3ܽww;[<3Yo &3Rm*s=f◷WCwO/hS4i$̭qqs^y_5De,+fi?)S5M@ч ]^u Pg/uch}"|?C'Dc[ȭu h[VKeոݰuW{7V]M&f͖5izszsT)ȶg.==[>C8 .t=j~7c ;izoM3_̦M1O[̺ih^]XRY)CM'̓gK=j;('ԫڰ ^.dӶ+: z>|4+wNVn,bG.,d5wEE WW}Izf"t`a~^T3x`UU q38#溨rtxVSzԅJku:i%5Ջfp[qR7 ׷󿿵ՆRKzfbtVBdt"weGdR0u4mU |g2Q{fQFalRY콡nςǾ/j_^"l:{[5jP{5'h26 (ZJ$O'5:W,'RHQ^%ǥo/r/۸HHi?Z珽?Ykm`@>zN 
5.[:fE\,ԝ{"%Y'l7B y꙽zWNϣ5zB hDkDb1fSZiG۸_+ ke*k%U"qLYd$Jf;5b`j&`w+{%Pr#w -sL6^/fͤ6W={}$!zU4PK1TqU,3h_U5882XN'ǘv3DPTֻ Aۜ<3sT9P_Y__Y@KhUeN+#40vjD0#kXaEZkVt9>#jaZrFc1WEZG\f>rZ+AGdZ\q *:q ԴcTPӓW l;zu*}s:)+ sezWpi:"s=sU5n,H乛+R'4W39&sU^;:wsrbURf>Vp)Fd\qBicn\4s ͕4N?+R~ކ?v=[~C렦b rO 'ws B3MN*SR昩SVk˄ksW/Dٔ\hpU`a~SoQb&[01KAe-?b,1I1!L"V ]x%7D!]w-Qtǥbɢroȏ[u ZG߻{9&HFfݥax\](W'&\Oښ?jI[{ l*=[Ү%ʄH1')!Ls#:j #@eJXiHRK5\cR>F7[s>]+0LGV.qmHDHKhƗrT[)/|v@,Ֆ=ьrJNn(0օ)ȓΉl YK)xQ+E/#GUv:ۺT6qZ|SF@GT) /uZDŽsVJKyTG41ѧm8Ngܳ27孢z iٞF9CS Ao^ߜ(J7IZ1-.zXxX7Q9I<#LdG2W`R7ui d4 +E(ޱ$Et.ŕV:ʴ4ȱF?6>slowX{=bm 8;FE;y=O^,5e%nRIF@`(58Fses< .Ępf3$uR$ʲA ]P'3r D &3.TD*B-SE&΀ҥ]>ky6v ,ޅcp걝u+;mP,:ixDx2Tl:~unzrR (MA)[g_ۺH)\+( (е"]dun[un[9,5Tԏ)\Τ݄Oژicg{p6QQEoC):d6e ,QznZOI-`K)dbIdet&3#) j3.m(6\Kwr1ys7,msfoo?`@,/.Z9 D=Sw9ahv rh77/ґ{_ˬ7v;\0^#Mph{z\ٞI2kؽ q$ۭçݵf~.sk#wrq? O\Jm Wt| |hVܖ Yl8\X2j,v,2, wՁE莏+=2[I uc Kx`UU q38#溨rtxVSzԅJku:i%5Ջfp[qR7 ׷1HZk 謄5$D2>`L(2 *iǗ24vd^Rc썍 {Cݞ}_4HgP^GEtjO-DZr2OrY<}F~<)i3')$L`b [ooCwH=_ڦ2ESV0Q |}lׇEUomWVoZu5|^EJwmK_˾e>/î Cd?, ӤLR>63CxlZ46,S3>UwuՃ([޽&0BrF$\$LxEYOSѦy*%5Q(vtT"S$HgA-p7q4rvE/lbvMrˑ,N.F'DH& DQ=8g2pxAw,P$v͘ 5J+#xB]sJR9E%C┡>G`l :%*I% TiX;%fr/e+ UaY({Yxt: Y_۾1w}VO~[r2}0:G1f60y,ѸT^3A%Ĉ6`d1t liA`xL%!N -h h2y2S@(YdKD ȄD --aw/{yєuIRQ5D%UÒwr=u}}wVq{;]_N5}4@^nh XfIrpɤTZ h qE-BDsiRDqFVZ[P%#wDA{^@7Z([F::ܯmW>W8@nࣷ"VDRu7v6z }e6EWbaxmG)hz8>ƛܤ\Jy S>Tߛ|^^`SMfB!O*iJF eH9&yC~;&Dnydc00% GlK\[-=wB)04gi yVm*grʔnwP&?\^>52,z{ !ͳCӵ{eчrzNadY7=9ֹ%9玜0q%j _63fϘ~Ѳ9t _=g+3PMY7z)m[Be ݭ_t>;%)KbK- ݣ[Z\\.K!~|#ur)6DB-ME ?ygLDj6?Wo$7r|enҷkpqN*?m:2?8E> j %9ȂPh*!=A*򑅬 m$L[zf\GpzQ}m54ٚ{0G#jd!p$ @z} :iJ#)'vnswJ;}e-,[,U3t;nrX#On̮N.XdsM#E#HT )SF/!1 BlQ52J1AjkC#/it*Ř[d-źWG9PZsWfSܛ6Eg][dY*Zqq:wj` *TA2LvplFd5j4Ho[eo[mETBhО Ku)Z00cXbGbxBwmۖYVV:$뙗K4I\JzcZ%<8F"3ZUDIɊQz3wYi`>[9m[zlWdn"7ݏ޽ JA݅@ 'dh|nD,EzC R(b ZDkpV2Q"[5qj$:|_"#_T-p\6r8yf,l U}NRjgc-)sXZ ԨHKh8w.v.MyCT%1 )LpпU=jfvޛ)̗M:9^ Iup. 
fVD^_^TsD۩sR|:\,M.[zKF3V^ڝvfID3zݭ7EuيLtAF+?DuQ1jqpmЍKX{.[g% QBx}Q+o+"ezlid~j0VwH4B:RH5xbj'̡9Tb398d4F`H_ڥFZz{!Xkv%eZmiI-Z;GSuDGX3$(H9GhDRHg^ v~I`_X]ڸb[<oH}no.goՋE;i#?٫jںF.0ϞgFmhC<W.+ЊS Z׫UHf[Ҿs7E_ fhtt:dk!c6zMwhqSGak-sk9~%V/e @b-kMxy77V4oRɋ=~3/|͒5nrZʎgh[gEg,wfœ9I!vмk(KJs-KQ\J0 BBZdx7+36VM35K#xlqR眲dvw4^icVh~F4.~`K$XRQB'{5thqW7/a*(%+Q qőpC3]q37A]{BHu!]3:fP ٿ_3TCvy84Kp|~mE'?)]|S|7>tB(r}w9yj߸úpr}w0+R U|gpYnr#Orsݙ_xg}v:}y& >Ȟ?D. ߭8__ ߁N|ZEGw7QMYrf\QM{50[۪ ˪z<.\@E)B 2wƁ :3gQxPMg]rZⰃ&Bs;?CY ԁNva @)ڬjB!]N5w! o^G8ԛ&4p|2GWk00JfnFz{N/(Ebh18o3 RHF%S69k5I,w?o \{3x4V 8䂉>|8`JP@TJk+&Q#(Y)D<b\p" } FźEqsE%q֫v\}>dDWMxբE˷9/䴱y3+ܚ =k\w?E!z.5Zsݱ@5chP*M# uz)J!$HvdZ)CZgg6X[J؄Bd,֝q3JYX2,,_R(ckozN9crڃ#0ŖmRFDminۨ}VMIQ-OG'Q+p97'nmTF%r4IV<{) ԳfOpHGzx5dUjK;q9]]oŪ:Α,ߘ8ab-uz^i58~B5 P\H.csVap٧(A x%|+T7N`mD^%!PsypdxL]m me">e']U8GL`ǹ@mO LQ "QV?!Q6;_Uw62SrP,1 60,υhX"8#Wȕ]9pO?/LJ )r8 we_(,+t>qL=ޘqG4z"jA_mֽuv;čN6nY\|;!zz~[MC^\gB0Bb/V>{~~ZH= zRylzGkh.#ݫD\Zמn@jOҖ O}GۧA!"P5z+0xL /H$>GBJod҂_ׯׯׯRҔ 2琊R%ׅR!3kL*imꖹc9a cRر,|7+jB6*Z陎;o|2w9p!mt=r;Y#i))T:<[:j9(ؘᩘ. 6E\/DVso#Q4xm$,Z܁+C7" "[ç^&X\bv]*#YV{sru7Pγ-ŧ@X x]Ѩ4+qgV?yX5n8>Xdv{^xsƼ}UyX_+yCW65~tm4 & iMǭo__|'*C 9A KV5?-!H!<9Vd9dL9JzYtTfR[PiBAL#ᖕkr)w(ӞIH10jAWgmxFҟ߶c|,}ΐŃq;:l>~KaXQ <,s< #*( Z6ф?2YMCMy(N:ZX^'@c|5z6~s"~BiAil}fKHEiuS5GI 2@-[,Ŗ4O;<%]R5)Z)pQDgYF u:gDYD2%M-rl:+ c2U׏ tO?bV;|z5=)V>9FpgkN>XO&(ç%]M~CxoW{iY, V5Xx?Fnm\coi&F-Yg5qFbN&WM]o$6/g̬af<(Q)0(6 ^ .EZsEJ~Sń sU6,Zv) }/_o+;V3W\ jdP`Ř".K1WEZuEJ-z [1WE\!.\i8wsUTh,GtuplnKE Wo]\Fͼl#<̉@ hg>VI`2sBPɗ,mrMlx4礵T)`}'pVdYs1? 
\?bG\+ф2TK5hcÿ,-NotS/zH_$A˚m-keͶٖ5۲f[l˚mY-ke+J!L^kŸV^ke}M_+R \{е'1Sߒ_%$ߒ[KoIoJdo-[K֒do-[K֒do- *lmKĭ-qkK -O>l}>[l}>1RT2?pZqtGmUǧj a8:܋kR\Q[\eƭm3"1 8죫&~ \ ~:*c9**x .hsHZ$)Kyf $:,*)P )EL$Ƒ G392Y\U9Б6ot~tE^K;tGqs7W.s&Y kpÇ-#5d>\,~b>֣~.;XȝMJiܻvߺL~G1_ȅv=̯}{dMzݣ9!^IJXv>o +yYtog>d#޿sq=7^^/_Mس4tHڮ e5t6ǧlIq3;s/DHO-~tynXȠ}AF uު¬ز7MoRozq{N0ۛl*~c pspt]nQ^-Qf-~E5Qu蓮[4)fS)nڡWOXM)OԱݽjl;e+'Գڰpy~v6⢧71ʝp7K+ G""#:YeG*YzĽ'Xf#ǪnlXO\;S,"zmr1@Ÿ3p3&1J̗"2l8-vBͺQ4RDQvПVrT?Tvx;K{2c.MWc:Fk'r 6[ytr^RP^&E[9*< |~e2Q(kc?aoۓ& = V]j%jP{è'5}DT 5Zk,xO2OX-N&d7@j0`'1ƒQEQgz ձ6HuH)\Y3҄2T0̃ɜxJ{İvK熡Z8vֈvգ,Jy@@|+W*t˅<6Ve:FOC7/LBZp2hS>*Ij`e(iInT[NLRdFbr<Qo4(04ѣ&T>&~MYjBx9OT['/=Pgr>Qr[eϯwQ9U 1>(pKfb7k }Ԇ:hT$.2FP2:ǘF:L̂q4=)`$;Mt)2ږ8-nr[XmeʶP5[{Cpkqwp<$,n [lM62R'"@,{ykZ1),-@T”!"@^̦P:( 6t9V2 TohW ZVTںY`wv AV$_bPB;e  ~Әp<'JQg,UaJdP"֢a,:DDI'nB6 _p>fi@Rݝ8/ka<Xm}E4"6XXY(U&f 5\L>H*)E2! 堅 jx;.U4ikEkXJQ E]X]D-DdL $[Ͳ貱ǧ!|Af<)$If<{) Գ"z -@{M`vUڬa\Z>(_pfU+լ'GTG5 P\J-\ƩFX鼦 \)" *zUn$|rXœp-_h~Z.K Mt"AQOЌ_=l1I&YOg鄗-`w`gWEr$9g0}-[%f 8Uf}Xپ䋶J//# Bh$B8 d4^xFeg4h6ƹI(JʘT&H% }b)ÔT9ayxϚ&g/vM¡KD;N`%w.!ϟIQN\-8>JZInzɶޏ)WԋZ2I8JG^ĹԤ]á~jI^ux!/D\љKὧ4m:ϻmluf姩'@Ez\?}^; zM;mL_nELnFq z?k |^>(K>SMgcvx3z4<gn9.[ja0~[>bz)`% ]^%RJ3c+-%̺G$ YѪ$ʍs4rڂ(# D4h%O\5m;S9+X3~2pT*K;3H*KDW* O]eqwm^+VQvpY:z=pi~BpbWY\~2pGWYJ`dל \ei9vRj++AQ?z"_ou>w\$;_!өջ3!1MA+IT.y[Q4"y̩٢{;xGDp hz\.sA9ZhB/E-TpحLhED|Bγ^p7spתUv֛DJ*1`>4SNipԶ*6C`+7U nv:5n1r!*+L_ q1)Q++"@]Tm$"*NYվycrAbEp+B T=힗E,zk?U65s0yo+B;G/:!. .VS uXYJb],;3 ֐O^~ ؆1յ]D+Ǚ$%Loш@5? 6 z,Z0'$$D5 R8h3D$#/[Y*H\@D ُ )V8 gErT@ dZ_:I1qV{GdioTbkQw,W J=je1cYhM МX&jY Ʉs()om>Ml*ցvX7 ~۹cɹLw1Բ7KmUƒ0E,/ :"E*RHxb*'%s*yQ`'ySQFGυV()5E; D锐%4BH4O*xƽ]1Hפ&d*~׏LWseYn QQ=e+_f]dt1\K 9Iü y*q1YNAvd`e0? 
1f!jNӖ1\ 8&H)7V'GBe^*Zd]Y LZ{4@7hdqXXyݒV"p$O}Y# |fgan/'aa?dž&߄űv}z%ACys}Jb:z "&<X !Aj28h;W%Y([fE4 L6s) ״Gx.BB>g]b ҎRRI"9~zpPIAu$*`WVACH*6o<4B"I4+ϻD' Tic5qDeuuմ^=`ӻ6m[>uز–U;[v>ۄkeTTzgt>tg,6?m.(}b[g.XegRK]m>B@?Ȁ`VG1D2.8x@̮\ Ymo5^y#μ!ݡ˦'ަhc?L-ja/Rf:(fgA#k{ ?v'ocar G"w }cV-lݺXjN:\+N3N1;; ]WoXRY)CN< ӂ)UbK*Wn,ږm]6CcB.6ybwa!MXh^Yef ^00c;]>fW<8P66lO:Kӽdچ;^\-bֈ ۄ#ڨq3M<}Bk>Nn8`JG  W0J #F:*Q)Q$wAo!ZQp7q%8Q~bVx13'G<'WfF2@paK%8 A1TBshF&JFbFl7hlٱ)j¨:ҒZΉr ~dтE*i[R)ՆxT;-(䨏)yT5Ydi/8"FBx4K&g5F4F- bq,;Dqٌe1^ 5&9m=QI%B"Ip>4SNioJKֿniv0 7&jXm._./t!Z%& ZAeVx f >ET68yIzAw l"ͅ9 R[ѣre$ЛreE\RH˕)reo\A-H`uoઈ+L_H xpUDlp , R WEڥ2 WEJ-\;N=ivP]_07?!)e@<Wv5Ny䑟`!zuxB?"Y L~vhj>𬢏9 c&:.zЉk%!'.rNv0Y;;` cOl ,? J;˜m{ϙ}=CqKۻ?DTR|BC(GBx=2vrg)sɸ+@0eRRF:uJuRv'2h0T2;[…(s@rGFan;Tph,$YD^3ܺ8MYY/`GHtKw83!1C ]KfE>O z,FzՇ8G#hҺnF+ϵz(xryv!hU)t55 ^}Ië ]WVo"4Bj4S>ngspM["mϫ _X%| ~Aj?m ZZhWŶ131-1I}ׯ SdpLh)Yl6ԙa  Z}V 3,*2 2ofRɜkRA"1#,/e@ (EiH,=GH1Hpjɮ&m(+=?·xcӸ}~W ̯ yJ)}6&TlduLt) ִGVxv!B>ҳq$$~,~R(ՙ(bvw-JɅ2y֯Z^5v4E`u@9Cߐǘu)1ol;3h(1+LB唗lkc]p˄fib)J;6̐䲌J*sap5Md*0\f`5q Nyƽw}Jipwu.Ǟ>FhE2]|]u].=d2d)d5 #EF_bRx"N,L*YΕ$HٓM&'ǵ**[p5q[<h4kzv7ʯ)>yg,iY?{u-E/r~ʚt?twGjF]ڐ.NtHfPeyt\ag5!;C+ͷlF풍h䅖 q$`As/D:$feNhlBJ@[6r?xv_.1.bgNSBDt%1DsT'UAyF8 Wxeٱ0ޓWXR\ږ^%u 9vsVwIƬt0gć̈Pmug:-w=hNEWZ |+x{!Ω0w &ub.B<^gLҚAʈLGg ZR䥚kybX>KㄮJjuJ.%Yf6Kr YVg.s.{gBU&3a(61{ 6vݏ0;/fm +^y>XOV<~{<_fz ~dzOxH,F%eR\!Bo]ьߓrxA]=}ru7zi8-7}f/3GszC+~Mο.~ytf}(xws@¯gEgݮ+9i;^SOӵnOeJ5E潟yc%ϭg͵~sϟ dfMㄼ#r:ai橿QO]]~H1׀   >:ٓˇc>x5_˷Gc9D;ځGzYF.f-ԇ͇hp{7Y\}~`P5ivsvsTc|XSfxXapt]n|c1-bvzn-冊\7|2a6򬛅g6f'6_C&´Fw]j,;e)&ԋڴopqVWlz~[󘏆rrㅅ̢MeB#:>LjkfLZxsd`GI ucÎ X4"ͣ!Wkhb&nF#Q)㙚#8p?v[źS*--E줱}Jk,j-o {o.*{;mT"0m<㸎I(т‚g]ArGfcRWɏ/2yid\J9r\1nO>M+)^^GEtX_-LF u//54Squ_:דK\_uՑv8S7nyg}yjtg܃ɢޙ Bgx̝v"DR-s19_Wdzzfm쬡E% @46JX.0Z3K?'PmMi+U+2lCl$$OvDi pѓ&TY&vMڅtl.a˦.dzzw^^ɨWOUQy{MޘHTȭ1~"rRv# 9j0 $dܠ k*s{f"`b\PZ eI2NQq%4َYYf싅2G{q|)5ܱ8z=? 
:7 nǓWبM6q"Ȩr\J A&re8J3%3U[R6(X) lJ ja1cRfAʈ]MGø\^ ںvcfdK@R}ȃ%*ߐ4.IkR+dU<̨ w L&KR" G(z1@NuҮvjléS+x)XM?.MCĆ4^E˴EkB4'%v鬳qTD|)G#ƔȣUΘ@q"mVdeQg|@RIq- 蠦_?Y]9{&c̠e'(GDσ6A0(@\ŨH些UH.HRd$ W \e $`rkBkZi&s픻gz8_! ~`8w' *1HYC. o5IkĖ8#g5$BTkCw[+-}g$BA[hag^8hlώA5bA@ z0I/}ԉj4:ƌY2E8HA 5\{ZUa RGL,*`eak$Eq(B4Ҕwq(:B%ס+ڒjy._>7$38`[(sh! &Hm]6֦RV>57`3H9c\p RoPDf~fܴx[K<m~WݛYڟG+gcIpலŵ^[_ n~417- b0wJW"lTCZ>DqR8֝O-2qP7%+"8>oi9)XU暮61COBD#0me Z("rKIPTOSbQss9[a)dG8c8Fg2,B$g:S"0n3>x.RY%g|0<XjЂKl%8ISH8avZ T)ݞ~ QPָKl&>/3TJU@  .3 (I ۋm6ʞA0Oy3i I"3bX*mъhD$H&H%Ĉ2'7%_x٬,U;2bU3s7)usadG/ T gaC11GFVDe1X@DY VUqNBuL%T8 `bj1E4Nyi})I?G,s$k:Qncf1)]}ņ&pc⤖?/dUY\X6KaglW9Zz2lӅYK#s"3!3e212m-2SOV3J=GE#匓F+Uȱ$DJg#UFA x$ )[X1Kc/豉hj4BZ"eZ&ƫJ?Vf-LT+.]7WvBR D]vcZuJ@bxEڠuOQt,ez]|6e-wlz2̔,k国䛭<AsRI9.p.Rw{+X ZvSYve=_2f˘c9˺:1cLL1q <`/*-jB$UGԦas׮&:^_Ȇm2.tah>I9x+6oQԘ/]Y;mדOdbr؅[&TG+e.-476t/I*ת Z+mqgƯ1WFmRF&3kU]LqY&be8L0ˆ7@>zY$d"iY@64a,(KיGG%$tyey7"5*CHF: lp6HF'g)Oc)Xem֡\qeԙz6tz$}ew.i/ }(q3mBw-hm__5p*uƣ vڜ>~3x0do8o)(6Z@i-UBA*Tߦ d@z+QmT&3F,RCSHݾ >)pdywQ03b#&@'18xTEi8賨se Ú1 .0AxfS"ap) ߛIc'p* "QuVn+5Q>^/!$,IVw],Lu2jkD )՜a&lEJRiLb"TzXpPF#YsI` ԪCu;"E RrHHwZI%pD0AP8@2\a/{,? 6EQ1k;cVkf]bXȘgz '"PUNÂ9!br ejL ›ڿ8O(bxvR×Swr;7$,̾~++$d#+,)[ Y1S:~t%p{>RJcg֝Qf_ wo'gl1K1&|c{q9['s;(nX.M<Iڑ_{.y9Ah-zɺQU9Qt$*Is>5r_<'w3996tJP m׍uk@~|u߿.WAdO:thM t%q_}6.t. 
>Ob"ܵ/D  Mq hXj#w%8 b0V4 @q pƾJaexPQR;6:oM~~0(g/ܨS$Pa2TL `H̴ByƱ),%`MlT9+Xzals~tRs#bF$ >BLXXGAp#ZB#01NC olOlR# QզGQ{~|^5)klvo[9d}֧%h-=}(*5yu |"LE>9\nF['MW<ܑKן<ڑ`{V4ؓm&T5F*6Oas%>+B$eB%2"  FP`z;B})n@ ()& zSNq4PfW12#XP1g&jZ26FjViL636aY(ZY㴛ûkRY]9ݰOAfz^7~KB2AL)ucF),0E)0D`5ktRE#iHƞ%AITgJʑ_ae9Tסs誹Ltw`THe&Oq%qy@eYA< 3}DfPDm56s sЦŮ6ۏGqb %ϵڦj7Z.w(tҊKVBTGN!E($4&IkREU{.x#\0@Dhk"$7A!pL%Ν:4 NڹpÎ/3x*|E4"63*Z6(ɃfpR]:s) A9hژ5PdtEQ4 7N@ ,j}0TRLdB@&A Cjo?_%\jvq,YpJjRL].6K>Rf>6k-t]EB J9 րK\jeQXbIljCϳ0a{k u{d\>' ~1ӌ(Ap-XHX)\4\).ݔ$IcG;ɓrr){) Գx_|*u:N&r;z@/;ET 5 ) f|L.#ʌ[K`gDb&R۹LmBch50/(s!KQR^>FIuz5 P\%9[K3Vap٧2 DzXC7[3e2m0G0˃LZ&E;yIÑ@cU#8wǽڃm4%NEGA/'E)gOARj(2Z/,h4nȚ J4 FG8R.t0F4ʐK {/J'_-Vk8Qc+tBOv>~JS﫳{bG @7}} kC8^v R K:/oFJX?6gQbuD ͨoP~[u:՟5(0_)m3@/htfVIx*5iBDw5X&,) /lMIIxu1(s7S t=Xˍ%w? 3ޒ'*C 9A(FƟ-\E~6Z~6~6j~60Ɔ  Z=bw6QI/2"lU uoe2 Y#idDmR6V0R\q8,=b<;`.FXՆaMm:Y^CUQVyoK@7ݦTf <,s< #*(<(h]r=D Cf ()o]7BOElɃdOf ^q>؀o9L]. jشd6d%Wwb"}HDC$1KoIܰ^'ӈKTZ/UH-IAJ"8˒6BL9#&"))hj=qۭΘЛN//W龊i;+`8.@|J ە-3 1u.Ilsчh2m|*qW`џKH OM0a؎̿_eu:wVH碲-m;C3^ZD2p6Ip:$epI(mI$Kꟼu˳N㼂#[eb q! HI@dp֧ոgj^'8L{8G*'Ƅu&}B "QVS+3( %`3ɪ<2a@Fh9EqJNST"mSk8=n#8˜.k˻9cij,_ls2$ Ab}4z}$z Lp}{A]=og nx'WkLZ0Pl ;nHIR9)sDΖ.왁^+$gRIkP c |4Zće?"٨hg:&QNro#A jܯ|ChӾpcaQUg7H3n  ){?)F_$ܷF}/P!6S"Ub&%8nMŖ<ջ>sCZ2(-bBM¨q8\Fp=l:2 4n)[JҸ4n)[JҸ4n)σXch)4n)[JҸ4n) cFc. /-qKiR-q fI[JgkTZJҸ4n)[JҸ4n)[Joeܕkv6yp=a|s3ǤE7]/(SQ'өwv1Ʋ@w(8PAï돟_Q n{GL^wHzRVZ#"?Džw(=֒XFe6irwCnEˡ%ݽ}f4MwGneL;^Nj{2ŋC}hͽ=o'}W je]͹4#ZR.$y"[> nLZ^٤vXϼW/Z%_i<~_%d{ H0;,'ɖؼJIHDJ vg>s|7ϏN5j53f?4enx<#]`KW'k/zе}҃޿8CW[VYNҭ0_lƍzk ʿW_d Z2S)$v|(U7?|;KW̕Ewе u1'i:-ޚYdHɣ?ȨÄqzsTy6:0/no\-~]2ch8α8m׮=]qJSW҇vV@MauEt]y%ɟƟGD֡sT>Sҥ^.=jd5S9[s/t<rn'wi2(%c&g0k*u>t"mi<^ ,Z0G2s\Xy_%c!)rBXh6c ;y^ I' )%G/|⮏lG_+ݨwRz|:^_><-z-hi򴬻x.wkW_"}hJMG|Ƚ=p_t{a >CސOɹ ha[q+ibZqVGnő[VD#_1RrSK:ӊ#ȭ8r+܊#7t@GʂR8rCF8r+܊#ȭ8r+ܐ3#pZȖwzEm9>P8Uh VG< ~,Ƭ*,;8qYTY!dQP 0m n)SS LP@ VGwGNrÝvr}vl4Cj!i~-zz١N&I*i'!hGJNQJF4zRӮdF2J1ec. 
@P`epe19A Ju5pkFD#-.VZ(î՛9EG,w,,-3݈e]\/{d<$CbFP 35*PʚxJ5$[bMF9mܲq-O[ .(MR$LZ4\\DI"Qq-\Rl(j{ΦPKʨR:> s7Q F+PYI1a[*ŇVk\" TwJʬI0jZ?}:5 6溃>,a[w=wdOE? ?{+uf"NIJKOP`ʼnec6fz1 k! bxx\-=:\Ur+R'=,yz z|;kEhOރ9i;+#]^?\0(ؚ`Kk2o{tO6ͮClI%YHknwM|Q/Gk-wC9 G摳|ONjsG5rWh;{Ds˦c/V7,Ogz8_% vGK̀p8ccog_*qM2IVO̐.#RH"[ ˭2A͋LR8 ny`4`Pi3;}/޼.꣊#*߽_ 7ɂIј͗3$ 8"K!$HP&.:*wL,8 QJ0}ZvaKq1-V@}u• Y_ Ҝk[>2!rxqpC?>9UŨF/RdbnUQDqQNW|n4x O?/jv^,ϴ¦JP /ਨ˗>8+ 牒=UEn-ma%^^[ )ӑjYoG4zկ=/ϊk{w7|q4icϐ\#T1 7.ةnpTqj>#V ZW+X߽F7 &mz>63 H5k_)89 %r_hj e )QQ&煞_ؒws$m튬u%j*hs9[K̰kO*}6td-{F8%8Jk/sx;IL;n|@9QۜhC0Rt,v=HVFӈ~ i6O 4fFR*c@ AG!Pǜ ۭ6s@=nc~À;m<8v( (u#MOwFΚyH|5A8 ѝҡrdKZqkVy^՛%i~M˲\jўhr?:/-KhP*;H eMu[<& v:/9gy )R'J3H,j(U(Ѣ DE=DcPȸGq`mCaO-e*Ur`}r1rVUh) PJHD`D$?=~kzi-j0 dSrm>hl!91j*EDt;Y串'Ա m^UbBHi;>bϿF\:qd_icXuhP0#7D$s?W1IDMR| k>䗲0LO?|Ӌ÷~];YM"@{p}]U߼kauה@n&|~)r-^Tۋ~+@; }q~L}:TWb׼)(a$L 4QQT )/? M)m ]B[1Sy\ٲ9US*թpv17Fn@u v}.ƾOJSmEx \xm~h=/OH?RAfK'DAKMYpU2q,ȒD[zPا#s\V6nytlǤ!%pA" /!Rz%KN.*83(8eGvho#?Mt^f:; bS{;hz8:q;kfi8+e-7{hC(#T't™21 sh݇}ľC8+XV Y ]slʝe)9kd1PͦjM ܇c slk]9S^^k1= nt.l=އMYC~c@vb#2+;SiNA{ BlƁ]:D";2슺Bj5ۮ2KsګoG]q.V~u tgU&]QW+B#Smբ{u( BQWHlWR%vu@ WՕk_%8pߞ|NzZ|WTTO8^`!Ө\ZTU<矓}׏Yvp;9N 2bwvLڙcLQ۾ dt}C;4LllKBF=~*u)7q5}ki8ov*]g\ͷ+S[n2-O1!^44yY˵pAb"&?ygLD3,|v8 Z0R PP-;7lUZ[is[8a$P}sQ`x_T+yvƭ)u$J`їV?9HoL}o(F3@7t&Y\$#^yxNzk43)r1GAn0ǃ@$SBe80(3Y5VHu mm"Y3Y_"rE7N'b.4urDO$  #R%dm4̑AhT"!FFC4R9Q`ښ%NLx- U\hZo9dzVplgkrb1E)Wiַ\/4UӡE6P`BS |f* ;/dR#D@(Md4'qG#زǖ=RlETBhОl !RD00˷)+(qN pc[ _Eݒ,˛ՍٕKB%BX$.r%1B-yJLw0*I >P, 9hf(Z3Y2rT@ dZ:Y- )NXu$ZұdS"x*K/81}|P3OwZt{z rY*naW}|##T;kJ/J ”EW ىؔztko=CF\!)|yGz\1D:&AK4V.qm8w]k@{)M]Mf]t tw/gz_X$!;=Y`d/ȒDz33?Ŗd`lIMǯbtmLOH rxk/24Tͬ;-.leFC7 dC%^}PmϷLP󝒛,\v׷9 w\ʷTH˺wŰئ_1סY-ֻ>[/r Ak N+ٶtKdfMx<ȽkΆ zK+,l;5A}τeI\BS<i"Q2J*dmȍIkX3o'crQyBb|ymJxy(?n%_Q-[ƻPFl zcEG3^l&EjC / q) k*ɘ!UJB(BE2FE29"s>cBsD@y+)1p'xrV[_[Tb?[⮜7⊞x4*Tb"%"9ded S9P -[.LcO>:G/ yŠrmya-W %ފDy Ɯ"S+`T^FW\ǛFW?=̡CFx3|vB̤ępze\rQ_m:A1XWEǘݕ  mQe7\.d%TzX]ǝ}0@9\LQ'sN8U^GSLJs$=I 7dy<,8&S M񻱹/S 3%ekP9&HB7Д2eO9|1oSn$1v=q~٨{z1cc^,}'FLv3r-3$1u!}dB }ʐK 
M;㌗՘lOs,BeB5qv5q.u/ ( CCS+Bw˂ gʥU/eFwCʝvjz:=i3(_6ohKQtxƚ]pd޶w( 5~Ge>{8c6KAhLBk%smGP h5h $ ݣw=v< E{iv1y~Dਲ਼NFOI"'$QTՙ$2EA>hSQ0ӫ8SsMM;-DH%;K&8R *n 4S KeʐE*TR側ϚMgf#yՆnl$?Džm~ GqjZJ\Y(t=|l}U)DH2s׼/`^mL|岸Һ2؊{YF:$_⮟GpC@3L\3'n:M~dzM xqF_[5QK/^PbT֋+$3u0LӢEi|]]?'%Jz]Xx>}\HD=ݼ囻b 7\%)Iة5LIsB-j5{̊Uzۺ~]wa@i\ s> y4<;}{QD6=d/;wc]-Y-]t5#:]eyG1(ҋe&G=msx1ps(dgmnuq]vݫ&qn[Ҵ#aEU(UL ,'|0-ñYwaH:Ye /I?ߝo?P~?xwʅ9}׷~f?Q\@IZ>Y$OAݛXijoѴ4iΠ];|vy>>v[vsk;uͧoi(A]buY#5*Uzqlmh8yT/Bp[1!#]~cBPJ4FiMgMnţMO?Bb'm"4hT3( }_iǖ(NKySf`(YZ:MځQrt!V= &D r>J!X@_EeJ *hS,1+Ow)Q24ҕxŜnrL[Q86Z*&'IRIcD4@&ήƉ{w1}dMx$Zǟ7WEߘH0Jxŭ1{KħBK@C6ab42m 9=3$`b\`QdY̥hrS,3 \&َJ5,bc-Sd]fܱ ~vqfz4YƍF GlM6q"J't\J c` aBe2!%-@K9C툖jlGď'\. 쉋\p&w3FD|lZDQ 5P-?X9qx,xXM;E|ے3v@;G_qNYEC\Bяk+G?M@ʾQpbl+:f(#lsnsa]Q 9}(ձ:c uK=ΡK}Ӕ,,э,IL3& he%yP S(BJJYVj# T,p_M{M.pک߫;B7+}L'\>!czί6%s3:HbIjR5䣧/D1`lφX0LEIl>Tr9Ul@Y8% ų/3Eyf7ېOR~(c.4WO_a/_5(eDi *Cʒ2j+A`¢(yޝ00\f"?v_mW?ݳt=ʷtaQr/:ץqh 9^Z SXw-Gk\be{]—-uYiUke0z5x=ru>r :TY&˘0Ynɲx($@oEuQ :`D0F6oa~{=ތYcL0eГ?$y{lBR:Ⱦ='{BGp#O1u11ݥ+w“,]7Jq6fH)=1 Bܔ0Tcg@Oa@[QqcS558 Ic8 |4ĸYAЫ୍L1@%chMWJ8&z]@;}rnӫQl1ާhR8xƃk*Wɏ4#j ߦNnQ{qϞͿǓ"mZWKJWhtU9 tԯ; 綅bݙhWL'۞iӒ<Zy(o~~_lahHN6^ gBӫdUj$nv7᳹a A Z}FygsQyytLe)5d!F)c1LKJl:v$מ+H1HpjDx5BǿtYli̙>ngwl{P]FSk K0 V9dI0b@ ZZC02m :2x Mxёqn-`\bެ᷽}B }I1²+w6Fy,(ۜxV#~NPR:6Cn 0xm*{U~Xޡ\^/%IA҈袌$霉e XQI"ZjF^PGuDb;>~!fެ\?rfճ+|tl(|O s X@;ViG (E_ɏn&SW 食V1<޵q\ȗ@WǙuқIԽ<%^S$CR!Ғ4(yٳs_rI|T<÷RD#Rk$ΐ/Z$HY fg9u}-}6/ZT_d}ᡓ^CCo mȫeDV6,s<ڛOjqV.o|Q˫Dߙ]Nӹ}7zq9U=ٚvUw5H6c˓7W45#6ғy=oXP@OZ7 67T#5.&)Ȫ{J(ʑVB1 Qs!vT"XTJ. <H5\K+\]7܏jG#u oiYAjDkBBnz}> YkQZ!A*RϨ+)6%]3=Z.kd-tόF.?e^k]ׂ>X^Æpdp# (ͽQ}c5%hIFTGFud#$ $Y"xd+A)D' Tic5qDf-t1gx ZE\`"$[rFJH2?8s$8Zer7T$52J1Ajk%NR^rL<999Nx2RG}kϖqChMb[j{"C6leO>&A3X LP7aҠDj7^ȤGjJјLcO5FnYu˪[nETBhО Ku)j00cXbGbxBwUe>8oe|~<)XvIKB%BX$.r%1B-yJLI#v*I"+5"rlYS\K:+ۓ̈?R֭*-Uw}"UxE/xtOf l"L]t p;RxiL"7S<|yGz\1RuL*h$Q.qm* jU6F)D;MM.x뜡3CxJ@mrpk1Y#gOt._DI  aqdj;̈sG[@71*Щ# &=yqzq ?h)l9n;j2O|{?k9nbhZ:B10 \`JR*xLh@ !6 ' PδT)z% (9wBT Ec)3tG$wNMy6E$t儳es-x XK~$R{Ĕad?RTᲝ+90 )zHzv~|zѝxx@OyEAhmIs !! 
P`uBW91 [{E\hD)c$pPp6V;ȄQx;4C3 cϏ>k΋LI7x v_\ޯݷR!_=Znh/]MWꓳKPE4l(&d.gʪf:-v%C"$ubU`sC.k_72xddJ <(QE`@x 9"Zd(@V/vCq֎1bU?5T'#xȩ,KUe_|DFtOWC_:(QHA=rVA[#QDD ,Ǖ 8_0#Z02!6Rڭ boVY\(9QF{?.070ޣ6"rgѧ[jLqQ605 ӱO f7M.+)!63x6lVpB5*8E%v2&'gJrޟ4^tB$LW{|_'$:]L>ju~Þ Rw1^ƶ$U•}D^.Ƨ_ƐTS,(rۏFO$b^j}Iϊsd]U!9wfC\H/(R/[O 澉 96tMPܞTdr}~ÿ~|eۿ7xW`= ']$(` znoF5oyka)OX=KyGmAolm.?oS$Nק&j ^X l~1NޟbnR%`""D# q_ӥ7sH7ėlvCSlN5z@i'py\u^]?T[uH_њ O8=\,rw(h) Ex*uAL*8 >$k@UU:M=ȣĤ<$V3" GcP+F T^*:b{^a;dtj/>:x3js[=MgqV6CMY.m>u( ^ Bb,8 @FӐ"UB+- i/սh)_X:&(ahuT{,!7nMӈ Lx:.>бǔsEF׏/~@ɹ } ;:sqr 9 4/e@-^k+[(Qe64e)L\kC/̨ͨlҶEXyo1}}doWܗ7{VJ=pe5ZsX1AjԼ&VFM n-/\ZjBH-,:$Nu"tJ`U TM(JKb얌J1YXlgW²PVYxE3ˌ;Jl8`ڽ FKlt҄PBEi)C}b:DsP*tr=,6T^3A%6`d;{ Kbn4 k* RR[ڪJ`ftLG+V xwL5XbMFe)*eXmX^NIHeӂB4Ԣ= eM{FdϷ>`^;<3đIy-$[r,Y(Km3@XnbXI88ʜ# T'\eWmņzU1Yk.z0(X. R" 炇zǙGGD8/XdnF';ײO@zQ]JNpM%M:eǨAw^t,O[Rwo\:øxpH1F8P0Zm*ja8 QQ&"]iLe߃ Y,p_jM{M.bp5s'Q J[+S?dꓴI+j$m#?~s1O+> z1wAh:ߟ .~׋@J`?xzɭ7񗫢 7q4}EAEV̲͹Ygui(7unJxnzcGڔ-V=qNi)иeZ%Zk@DYC_" ל*N+!tpXLq"9ʔ'>|VPGO iXR˶!8 .:G]s*M|WTe:@+;Pt|wnZ6B\. s.=`Os-c洋zOV,=JT絨OXԃos%8U1X`ઘPXk͹YI+"!l`5L ;\+hp J*  W\UVsbW@3ـ U-*hθ=w*VD\BMHb\s \k-;\W6Įqb "k5;U_%\^Wd0r;* \W˹ټ9!U1x@yWv(pE֪cWʕFWhmwE9vU̅UءXi\2Juϳ~1un+7@~ŷ z1TGxޏ[p‡>ͺoъpPe'̔[H:V/Þ^4Ħ봬B3r ۽'T?T̮ q?]:PНb~[lO_~Pk?_NiNVb79h嶑mȃ]It YF1a\5yme ʩ];vFr\:Ztk#5s۲+3뤔Ϊκr21}PLqՔ囲|SoMY)7e,F"D'PUjnyMI57&ܤTsjn4ViR͍n6&ܤTsjnRMI57&|.e(sd9O3WXLimz# 9rozŕ8~>:4B X,WH0:c:Ou^%-sHjz&ܩMROeЌ;9yJfgfȼ&GFaV9i 灁2"O/xfL1@FE똄W^ԪfmXӺCd"Q|xAߕ$m"ۣ<{S1(ɱ9sq8|eUmvkjhƌaFG[{ b#޵C=Uci^r4*uOW#6gu8/ťW-]o߸{V-Ѭ|{z\x2.fT̘_iFʼnO{{率ljyka4:Zm&Vlp{Z<^x|t}=K"_+)Cv{mLkɶ11-!1Fyݗ_|2y۝atS MK[P C8 Ecl4Q$ʬw6`QGy0[T!F)cqLK6P#! 
Ev][]͜P I}y#ЅqBPc\ge3[p/< |T| &lh^=`M;LEAR&*2YQrU6}x~!Bk,vͽy6Dߣ7H좡Uc >\JC%@5)H ѡV]Yb13q,˸BIDS[,N,ț7Ik'}׷Ϸ:n0\֮KzV3|z'zaVz=ο&)ueb2 J$@C|OI&RT,% !$# A&8֙sxxv"(3(' 1Dy>r 6p %>tʢ`)*+C^_@LpfZZe.o'aq>9MS;X5^p6XoeI,IL'J $nb?ˈLRśiÍGVHhmQGYKaa6~X 놩QQ*-A:v.\ҀY?mBD"Ԉ JizձB @ceul˼-2"AP&!˲L6.8keB3ϴO1p%y4NlN e!f freaď+;yv9/ QeM=_d8dS"IɊk FkT~YHY:`9frĜUru% D ,g͞l099t4r`f#Ad< w;ZT~>Ecn+gy"h&ue}&Q7`&3kdPeyt\ag5!;C3ͷlF풍h䅖 q$`As/D:$feNhlBJ@[6r?x/WE77Y=x`]?{8s Ij|P^01"pU6^YEXɫR,).mNZe;-<-].X9 fsL۾{sN-g:ts]Yzu7']*~4~1X`FE[` ]X)X1{}\t[i{!Ω0w ُ[k @&iDw eD&Bvϣ3-|)ZKwȵ<1IN>Kㄮ\J 0xmJ> ՙY,zP}jH[$f}+ͥIɣW.f?Vį]ၷS]fߧ+ƫ,b}ыYxGof3hW[M}sb]w߶-AC.. tGݼMym+mun{?<_8+;者[ %&(e+ o#ɴЊYج=!gܻlXMH's QBA)Ǜٲvġ~ztƻ@FlNǨR"X%0Lg42eTiL4D8 " 0 Rs G1Cc(!G" @HX)p?'0gtϻ͡Ҥ_ r;p ڀ8FO(K$9ˊL4B -Я\z݀'FiҐ' )\ 1UB6漰#SIoE"qVLy9#DlSkuP螉N)],(yK1,L8SN3.Iqۨ6mobØݕ  =fԓHQm?n2gw#+ ^Џ \bf68Asq&u>:ń{--&o0^w6("J{\,qNn%|7͍p}GHlx1SꌗG6a1A&=xo'k #tjl"GxغICfèL=/y7wEu!G<%+tQ5~/|E*  LYU6i᎜{f# +⑬FbO絮 kV UXxsW/{}1Fh!uA-(,l}${d6{<&ki#ŗ^U6N;3w4~PMl¬mct"nw>WUtW2t\|Kkղ?ΣRϽ)ٷ>;-mqW+X^;?;>,Sam IH.ř CwDS  W6&Q?D HG43IWy b\pܭG j1rx<+ޢZG7킕ZO6RX@+|"DE8rFk ]38HHq pV)"C!DhƠSbɑR6!HFb܏tbXX36BU eI3ρ\d͸ٜ\j WO~sAeet<[(4!ds!:LT1i7$%\~: c@u."c6l`$g6T^3A%6`dh;{ #v1rGl7sg4Rܱ)j¨:4;K$j9' ãT9Cɢ&NܲHI6lm$iA!-VQ!Ԓ$L{#b.$G:*[m1rڨeYd`D,CL> (#R06)D%K J)rJ뜐 Zt("M d $Qʂ2θ༆(G=Z gCaD,FxyU;,||ŸdC\&CEbwquydpAIJqqIAHDx)-F&4`lŇ]bܱ#xi@Xv" M+ ~|GhɊg?D#Q,^&=<%T*61)iOz|@]N ܐL Ds~NXAU3=Ofkd%-x$Fk' !if$1#(c.K̈́-[HШB*jcAwxpV[8Q4!6҄v{(#gl9Lf_7ҳP!Sr+\ZqËok5;^SC*ZOΛ/[^.[UYe!7͵pizu<c9>br2=jQ9[A %Ĝ/?=>i斛]p! ~@>8~ub!6ѭ#X=Y0u|,2{p,|8_&z|=䎇ݣmԶgUH3 d G>bz#~@io)nXs}S~8 ϙةZ^N/9  φ<8 n{ZVW8{f<[Q2IGUk9%rvs^4S{?Msx@T6" hk$s$tzrPr43|mXcEPz571={:`fg{M+~gr1}^ſh潄koQ4rc 9.ߍ_= #D?Br)ZFw2Zvp• Ӣpk \erx.ppԺ ʙbz;qfKD bANI: $>-㏑1f!jNӖ1]9&H)7V'h&˼T.Qõ4ɺ[&PV; Tx㩎F&%'9b y*ݥ9k7 0}\xNm3v/?vl2UYC\V`{Q'2! 
'ZUT Bt@6[!LGSN$iQ x1q]6 &eFjDzM #g͞#g}u>~Fq>8~q뇽,ZjKZPًIsBcz x:`bA<,rNi`*!Eph̑Bh-*PFFC4R9R`ښK3J1&*yb Kp1rz7 ǣpښ=[wx[lƼî,',Ɋ[-%@E6P`BU |&*Kl& TQP)*j\#mٖm%Q e B{NZHђ4ES>S8lζl?嬊a~-/V+8e N4I?yI(DKSEw^:FeYB3OI;i$1ΡU%In{sks0oءoqycLܽ|N;*zlzvKg>)fg}<}> X9J/*TEW ىؔ2,ӝzȍ8ٻ6n$e \W^'9.݋jĥSbL4I^~ ErDiS%40@E#匓F-Uȱ$DAF YZ AHRVa S띱Vc3k1hFs+%"Plm7^t~aC3IgJ=w4ZT|y a_\غ>\[N]ƳOauIgH4%IS Ȁ]L2^s!h]@ru1o{XtK6uAn^ͲChY@ˢnZޯzk*EK-7Cl~!nmxtLo{Vo鸺w[IvK.õf=vMTR*wmS{'?:1P^3/inn4DR1XjENS8:M}6JPSZ ӕ8a\5Gb$*τS#Tʶf['VI ݝ!CLr7E?@Y|6~vėI_+~5< ~'XHE8m< {$X2"haOtpDH\q 0<0()QBP`S ;ˉr!= c0:ϜL9C1!:"7+4I].Sw v@fxH{ (R1(/rF(`;$4jbjeGh'e1 pq;z6}Ѣa϶?rIFԃܚ]n'viOd&y) \5 xP}0"0n3>x.Dcq)JOmtKl%8Ӷd|CG>̚!wK޹ ahr&hl4Mn/XWe$w!MB޻hpy&<}M?sC2w&ߏ{bMRnRWfsL) R:ieyMYN@p=dM󝻗m99HHvی}w2k1Ƣ8cV m`j7|n_Dq 1c"jx`:aZZtm|0|:g*'s28&"r9cWs3ʒf "bDz$Hqr sN68@$+@(IzzYrast>͎aL)E)Z18azFc3#-Ձ8bf7=/6oܳ¢Vved(=V}I' +Wi1/v' {{[pIiV[ [/(~X^̛zoMGv/N_G ?=uo&a?rx?G<07QB EI@(+-pvD<~e苗K2>H(h)ۡUm'<"/&<+5T e (4z_W~Q޸| ߙL9zoson4^Aە KyQzdAx2Gg i1.u'|IH]G+mSiqkؐ,=um-mx-z47p.l!7!E)W&lEJjmL"Tz{M1l3Ѹ-U$0+ZWY^G*#R"%1d@h*hƝEKF$-zGACcPa1fUOYۅ3,dPo\!P;Oʈa>=Ac'auGcM. 
D z02"ͱb^z$hhC{]2o]l{w6ַ67gمKΒ4)MM+oRIރ6*E2zϥ*0 $beCj4Nl0,:4IiꊰWN< YGE}oEUN=v(ɖF"r6FgRPޝ4\t x}*xD5 /-,.'йKuI黋ixy( g7(\Lo> molJ V VU[_D/_Ϯcd@gU+3$t1Ixg!arq'J޽9U{V~-u͟-]UYø b͗0Wnm Ej_o/\bqa!4V#1~aH02Yfw}J{?n녞ލٿ6LN8*AG'4j\%s74ݔz9OK]Lb/]3ͯRWplP|ݩygqW/.~|^_x Lśx5:Y Mo{|5Zƫm#sՂӜSn݉]*}8[ Dޯ7_^KN]FyY],NMЊIJp==_7Z =WuQR*U\;s׾2n@0|ϛpt_xXh&mg4U-S(C픝&0%ja `_/Ew1][) *0w zd3e4([\S$Pa2TL H̴ByƱ),%Țc)um(xN2`s~tRs#bF$ >BLXxZp#ZB#P1N*%Q9f㉝Yaά|suFЕI'52`4 S\ksiǡx cyV(c!R }ef>23U҃(-ˮ"F$h%$5+aJ!:iEYRp%N OPUVѱ+ "wG"\PjhΣ4%60q6 ~dx6o.$a; jW>|0&I~T)AHT.FF:.VL@,y1}d Fq$AًR+'U&+*oAx ϱV TȈNi,h[q2JFHqbrN2|Jl0DȘȘOVɆ8cW,PtXxDkY$7D,o>`ڼ FGl.ULyA@2)y(%р%:(rx@J'U42XOPJlR!`V0/2ntg'UfFvĶ_R1FʌڲCNS2klJ[9J`h@QAOԐ4 $oȅ4*YJkfE;$`)I@ 88{@ҘxxX*uǂXQu!F\k'8q뤓䙀d0 $XCH+J.fED0BB21u˼d$&:łKsI&9b]nllGW^\酏֙Kv*Wf>\.v㍵(G|tV8 |D aӊ@؟i1px,x;wE|ے3^<#[%8kΉ9w([8uޏOC(cRx`& UG/aN(E f-@Eqck%N> [3E$W$ c0[r / 88b^b%r4"PP DEdQ0 mU.k%Od=s;FvqƸ)mxU]&_fD`UԬNBrS DA5:ƂY E)8Jl'<(ʏ-c;PN!uT[¹c*v`"b4}q3:)0]Cc( ;EmG>|:ؐ6[R1HĔLIy݀8iMN9}M%71"i(״0G Szd܆T[ܧ@Tbb+Ɠї,\ nrx0bQG"`"RSFDD JEV 1F!wKlhlM <[$V^|~.S1.\uC^+k)L@yMRfMTQ͍QmQ'(#ɉsե~NyFaA3ZLL %ثg&*)g>k 0?JR~*p*QqW \^!BOW"3p0r9{ZzOUaT#M.W]_=&\qBpd ȕH \%j U!\>!:Jb|*p=~JTR3+JwmI4Qa'316ɗA?ǜȤFۃ{H&X"/yۧI@ `ա+XE|銬y ]Npɧ+XS8]՟?z++o d/X]-Ʀ(kk"_2==#eWy* RM:+K>]cI!ԭ~sv]$o{֜Fdq1?]李Դ%ZaUnt:hH߰mΎ^G]ZlѪ5G-B]'=Ur b\ l1We`>ABxXԼZ@ao̮}V[:sE9~[sCIxj,]ilw!IWdo<@zFu$F|u}b _:Гo6;:L]kS="Dߢig9&U}㳕IXmJYm v?죢öehZԘ e*ZTvBYe}ʹR"Ct/Ka*FZ);a!ڕZo]Ts%i,ߴ?s"K598D ZkhB1k-\t7͛ P,nmYiR odiu͛h4 cĘDhi{t Y ftcWID30fEFzyyz)b/9%|^{ D4HɤrV,G@[G҆^6'+QѤ*Cd] #0[L.\:T{}2:[D{C!}tB$bn@QD&.NSi7YRXGQ%cNօM,7C6v/Ss.=o iTE8USI{k%h(%%Ɋ] sHr=ʜ%~n:ƾU21avx7Eע$S&mI!a+D'*|!4dmt}c;Ʊ{xNQbѥ`D)%pMNԵHF>{(.Ac 0g](!S׆RDh 2Q[ݫ N6` 8 uԁnÈ#Xe~0@&!Vˮ+q$hy/$t 9vTl^X:?*j@J4TU;+Qya&k) H6VE`Í?d]@,M$C.2The1h6F2= RnbTnB%| \::$ճR"Pc*H "6 O4&`mL)<(#hqlyZ| 1NsyOټ.v@lmOs9EÇO@]Р$OA!%ɯ?Z>_vǏhJJcc = Q?-`)e^ˣk__ߥ/4esBn6zV9R2\-f$̊k_QrV}n,%oV2.lC}}rQ4^o/[%̯quHxў|LrZnâ=)^m? 
dKP=M*.s9TSMu\aM$u~L;slཬ~ sa[)"'J:,QK) ׼8Z<=;_?Ka5z[L/^}ui<)K\dJڲͯ8ykgOBPxwNe=3`QfIk6&"CyA^=Q=-5I9w i L[}9\MD7Q(rySV&nE#ȏ_D}78ZCofBPޮ27^-xlZAg@xG;r vk]8YTo6( yhtC8)hܴTU']7jiWNˇDZG\ͻP,h˞^XM↿<-wa~s@d3Izu(dm|R |iFgjuX>T[Oi_̬;Z'R#=RLibڪUKafmh.uWB7}}s2>i 2㿔]3 Mz!$ex@h)P8&VdG)V0JEHb_.&Trkz/_~~;=|q՘o^m R/juծۦg7|fgW;hۃw//}q?ȐKR|0#<|8-IMóIk\FJ%F)'@A(ed3dݣ}z1z(޸.Gy2ڱئlו2pfV/nZ!ʣxLWpd׾ަ՜ڕ+j[+H}wx!"u.\h4zޕ_|vpH띐!rXlLlvqJ ÙSzXP" |XU {K1(Ah4#3IUuC88Pd8#ЅqJ@2{e᭽G3q3kR蕗@bJKiIDʱ!:X[.8jg2,2y탸iݳ~)67s%o zuChF&d6.KQ5N#-Ai{sST9ɬI\F,J=S%&7UQ.ޡ\gHKdޥ u)k۳JI*&9&ʲ|ђ9:57פ&1[{,כ^ĹP%D03na8­F5q˜?<.kݜ1N}XS쏥뽅M/'l+K3ז!&F<7JqT#qGy@lHQ[·tBHt)ִ>0!4ٻiBφMхE,z'HE$u ҄Yk=ׂx GU3(K, Fp1f[JD!*e s *Ji6,KP s)n τe,d#ݫzOV#ũQi~}Y/@ܭ6KZk^eͼ]ze@33"I 5L Fȭ6:~YH,"*` LD Is=pHR(:H^vA[+FMT?0<Էyvk*'Ss;1K[Vg=.mG@j@ M x28Zј`-ϔ[J|.H^`F%NL{V" 2'|BJΔā[ܲ>ߵryojw.XK0h&iM`P0 `L8Coh HʰdC{= TwNI0ג;2q;N͌3s\79KKX-w}'jT{u4oi@YӀ QLa†)nI< t`g.p%eL .@sHKu)knEIB6` jmJY8qRD%+b̂e6s#%xƢ0Am5qd3k>BI{7 ]wcgAl-ϝ̖oo\?UcOwM6M@2*|xדy'mxHrh2/5PKnĠC. jPvuӵ==lO6w?ͶCjPfws/yn}(-Hx!:OuoLE{:^Yt=Fyv\tǮTGzM`y> V1M: kf$E" f#МsBʝ yQ@Bs&5qΖg=`HYVh̀3α3)q6~rp2ڬvgnB9{( 'ԫ@Qmfw\F$.d%TÈ hcta&x'a\` *hH9d4RD&>in'4.r}!|h{SЇCm~wߣ_wuiurZvw*n1~skfhf8Bk~+=uҷ[|j%%_NV3Ǯܾ*Oޝh<}~H",[]d ?B 'sGŠyye%Z1i\vR 4W@ ̧P4*}:&9%E1/gG\==n#]gLF;ԗiotYi{r/.Ҫ+;Z brq{ӿfb\Ҕz>'Ay/[.˲webi|$2-Oy03'a;\7^{K_K}.`gy"ТEZd{/S{A9XuFm =dI_|#~`WN]?Q5=ޏuB؅nXfJXޔ1AjƬKJq0 !cغeF$:Ы+SY)B/`xN_;n|7ˑ)qDUpm)2qU gYD*GE j3J:+M3Kǟ~|tcg`A3+³Us郡2TqB2N\5bWsF-J̚oǦB 8*EQ* h@VLP׬yAj^,~ϢQԯAH.s&wJ@ Ѭw &_Gwں\ k p)eP"[aO,[õ5LĿNqP!Qb (fSB>L܎9Iyq`]e]M'F1Ҏcl`Z %ӠLXvPi K,A1EMODN2c8/]X4We2:E,ٚH(Ɉ$\0F1d*j췇-_FYx.~EtE,`II!0m"3  9E\ H ]jL)$,V\ F N% 2I e%X*[j췈_oj$8[Əi`wi< bDZg>>[+mU"@ f1w xR!r>SsմL'0a^mu !{W=q95_ z2 CP S!Zx@'f9$HNHP/6&c gx$?8<̭ࣂDp@NxCסP+.bLQYR9Gh R;CJ\DBpPJL$-Y|$3XWh͕MdʾҳVHrZZ&OwaǣoŎ˥uIy-=-.: 94ch6q4 $~*J#57y)yIkN4a4+/ *5|2]1,wm_!e7@0 ,Nsɾ N_b}J\QLR߷z"y X8_U)TD2G* fi%1#AyГ7_< ʏ{(k{YաuO -6mTy"Γ2"%y3/>=AyzhѴKi IԠcj 8H"ͱb^z$hhXJ=eYpJ8Ċq~ ? 
5gq<*{2ߞ$@KVd s^>0 F160knRAa73nlr䶧%4#dy4|Lv dB#Lޡw]3['YW:\t!,_ZX.]Jy\B<9^H]U0 &y>,>LOÔc:9ԝ߄ΝɁ|Kf/r$z|r{w>l TjC.$\"8ө{8+StqF&?tT͓k]u91K1%| bwu][gvP[m}k3~\0FҪv$Wt[7 vч0(xEjm㇃Cdz1{ɚ BkGM=׍ZMnԺJ g68 B#yL>3<+vEwqgX8)XP,w*`O^t;ww /߿!|~w?D]テS/&Э#AҝIu'܁ESCxTCu·&\32lOڞ9QإŇך{S^\|W_~sm9y|\RNڜ*. ܭ!D "fARp#Ąu0-"3%oP8*7^c{Yg;f'uEy^xh)k [&JsVe2/p$K>t~ n8K}7JuZ,yqVS&g;3~U +?a'1/N?7TlߍZ^*g7*% EN@tlp99`sSgߖ1T#0ㇷ>`}hIrb?Z,b?J"KŖOPeb?w DT'W\NZUBH1O`+P+UBb3tJU"X蓁D§W@-*QJW_%\qLG'W@F#]%r<JR~p䨅sa~k ΝӶ9g)3,<];}\Q֐3ɰ %Befqa s4WeOaz3w,@+0-"zH-c鴁!A[ΩgwnՒꝝ[?)A sIΊzֽ]..wV싺=m67-99_rf"ZLSIE+o]uW6]$6eJc1=J#xSA ƙARL+>~hUY,wY(Dcl`:1sFF3.RuGV ki fck U cJ#" 0@SKTknXe.K9/># +tk gϵ#쬦 (K?3#eҝc{p@+O ;]2"`vS][zrJioN%Uz-Uf)&y\H3\h`%vUTm aЙ@tb~<AY6ỏ-}6nσWoA[xP|Vf(5\Xga5Aښ;hö#i V/&i-,OO??6SHC3KFDZ3aP0J;Džd1OaZ(-aFhD[7#Zɵ&`"{&CD0E4^JJdTYXZYh64ljf(J`~|Z-#*]rLpOC]j|oјWԔ+Y:00+#頱!*NXǹқr ډ{IaeF^鸜Mn=_lH&ڂL RzBmC7zٚ Kkq2w@.'s!QK>Z8Qx~ "~9srl&GkyUNuft /DJyå [3jg>O*\^1bIo]_͸w+z ՟I yt`/n jLϳm+_ۼ* jL)y[YQ=ip3N(߀0&¥0op$su8fmC43 s+_FCbʔ_-o 3[\|c$CҼ1B bi!$`LQQhu!NHJhl. hr *sVA-xf֡E1Bs46h- \~.i WBP㮯߷o᳿V Wdլ3jEf煘V^=Ӈɯ%0MV}GeHOD y'q?&s"kV:HW"խq0e,ģLdYAJU@^g?8>ʔ"·n2Z2a(g@EHEŰT8шHd;LLPܨe0bQG"`"RSFDD bk+CʘtÖٲ ݍ_^I3wާiݜ ZSZ8uf.tyVCiEzAE 3q RT#8A{+_|$Xe i`1-Xhm P3hAɋR*:d+Wǣpk̞,V5V"1fy RH9Wt)Ͳ&}b1i/ATb.M\FqefÂg:iQp!Zݲ-[HuKʃr'TyMRk4H$i \i 1j[ݲ-k^_VY cxX~26;QG~?adG/ T gaC11GFVDeDY Z@A`~J]1HY)3g*s e|em:Mވ?Rm55Β+-z U*i^ѢuOJsـfꬥ9xיfa V3m5ӣz@rQ9".a(g$65h*D%8 Rt6RelT!D!e!x0k5f1/豉hj4BZ"j#gKʹWm2zVKnlo`F1y^L[ժH Ϸp֢Y"F[Duy oninA?~峭\K8]D$;]^oԸ d_W_Zg:jws0z]߳f!̠eV]zz5ygZqY_t}@lyY7o,;qw{:akVӯnXWO*2kOď_j;萻4V7?nh_~|g.آ_8:M}+b6_KYN4o> イ4\8#aTy&ttP5c:Mi}/3c2Q(*eim nR[/?'Wߟ^s MJ k"xB6r=Q , Q-""\nZ`80()QBP`S ;ˉr!= c0Ȝ"scBuD97%4I]oa ^js&%se9xH{ )9#sNDj" ZO33M}+|BO< yxǙ@b8d,s$(VcR a5s"OSȰi_=HtAȍCp9N-s"#@@5>Ș6P 9aGh8Fcaef]-jxv[ ! 
'm\r[Z,} <9T62d d#,r8MS!AO } )0EY gl1 \'s,nZ S%(CC&y2o ~nO LSyƛ-n֚DzZ{섌⒛ԝNdOČ4[{Ϟk:F5CЏKp/19b|& N&=_;%11 M`0Fv5pp~0`&K BBhPmq*-C=7[mC&˸u"@n_&ah/efӧdBxJMCofg>zۆ=`epv14.9t_<LV]  ?9nWbri [75kGuHkA~efyOp(2ŒDO9r]?&׍^5s>,ƧϿF66}2!{eYt^tIQ\(O5X8= bwo~z߿w0{of/f8-Lw"`ژ\7o>ZCx󡥁 jl7˚f͹Ş.lk+@R~~0OB2|6p֦`̏s9DJN^&1?&'0DKտ.RX8 1!nTKWo+fyǸGZE'&+$4"I d lh"(1KKC u'=taVzncmwbsr(l*3gӨG pŠF/ku*:[O<^y/bW*eǖOtO4K 0O/? F?=a,xR @M*U\nY=%poE=eH$kPK V;^#ʈxE|i!= &D >ӎ h0xmJ>P,F3/5\JgGV%VeoѦ`oÃS.> ˊ/ *hS,1+Ow)ܙIie*#y9=[Ja²ő$ Цk˫Dx-',=hBZȹYP-@г'l\mWOn}wiy<>kókZ/<"ē&9|%gnR&s^Ӕ)}eZoLO%qIh%%!K0`LBf2F mU͈FyCcb\IT'e.E38"rssfUjq[]+BtEގO 2^uzylfu  Fwtx`Ɇ1NY鄎K)DR 2q-ygtݕ@P=+ B6%6L36 ,@sЦƮFۏq[;nMe;!k7I˔SV&_ wʎF0F6oa~elxBZncOΞ3ʈ's(@ER :`pAp 亡Ų+TTKe݃cD ܗZ^ق]tZ /jmjs`[ƛ qͨ뚞@')kF>}ߪ''/R/C[dl͹Ycu"-3AsT۶C< j!QliL3e!DYڨuP u( $ ݬhgɱk8N'>~~:{Cpzڪ2 #AL}-?ya)!|7 &rM)ЀqJ^I#/]{S .? 릡hqKL@TVRM@0+y@ST Fo3_I8;o9ݿT7O:@۹0(]HiS< R3P\:b>b.ƇdQJɛQkt܂-i.!.$$8e2aޕ XIћR;8 %*dUov!Zz\ 蜶ŬO՗:Y:i̓^fhɬ#S#/@gp,y`, ?sNVUO" .BqWuwN#9]Z2 }F֚Qj8Wpv*:Sj2d\[FK6>T6>t6>X6(Rv'2hƝ\D-](M2żFǕ xBݺ;'<0$"OKȝB64sl{kuEQuz97)m>-_zlxz)[FءGBG57tnHSL,HNS$q%ZQ# - iY# 8bnJecCFwAFEsDFZ|6 PVzC }k (b 3EN>bb {t^dbj4i[-H߽Yt&Sh58j §`\2"ֵ௥݀ ll;8H20xHk塼'_l-ahHN6^ gBӫdU܅9 EjHHY{gw6`GyTBݒ\Cb2<IBiPes))Bbȹ9 E:q|D|xٽI ",w"I#F RA!`<ijLr'UwޞIxv7mxvMSn) % |7dI"&A{6N%Y(O"~5zP8W6YR^SxLe'9`溒o]ɷ--\{ c I$Υu&}"\21^[if+wСM9Ud1s/X"8PV’,EvZlW: &x|x4-υFKaq>)Lk0/x`O  ` ALW "Znas. 
~h1 5͓"ʳj|&~:V&x@w ZYՍO9q< ;uk=;P޵+"ioj~ f`e08e'A@n۝8}|irm9$ Ex*~U,V%)1MHӠ$WEBzPp\,YTLg #Tc2v݇h;_^+z{[ɬgdByIdbh2FyCP10.Q1NJilyLƩhD4Y4BbS PΌ)8M+9!%{t`H$528#C_/1~SwP,gKqHn:$Ś]gr-GYw`dPӲ׉ZfV҂He(@K?# #Ip81mfCCOuYon*jNPYF_"Ǿ,s5NhD+ƄIzgS A!2 O :a8A?_Y' ʃIDJTshit0> V'Iޟv,kBOᕥ7)SōIˆyW-ޢx>ҫfsLӪ!o.EK8jHR4(LWϺIyM;w/|teFPGW6a;w(%_)g=!;-%i#ɺl[6u{zhhKmYqX9؝ e97{ԷM qFihVIC‰Y*έyF#p]fڦ9hxex)qn6.G\ 37ݾ\Ԑ!MkB}<4$ FfR1_-|I '7qQb0RX]]gX٨ƺ\߼UEe\{ԁ߫7hMO-r:@^Sh"J* oE!N[.,׫M?a6vݏr"nSm@ c6BOխr8ߵboCgMy?R,v9sxG|\30ul6]tW"mB7m(f,*_:q>6HloQzfv?Zǧ|綟|~4O/ʛP{Dl$H"J2CM¿*)uґrɈ&1'6Ar{܋KЛr6R.FE &9H%iFD9%"=0451kLv$9g3 KG- %i1ĨR>rv,guUr$fHʇ{Q󝊯 K]D@8-JrI$sh+5 QC#gpJIʉkDlDO;R{M`>#!J%jr? HN̤qѝPΠ bP7`' vZx/4ـ DeҀv "e h9H)D|"I qQG{QG{TG{kz'Apm[@ESY |)$1Eԕ2h8#<Шxcp~K E>ݳUutRT #Sp[\je G( ^(GXfwu+g0?~7VpDbjҿqikJ$r* a63؄v 6W?ؤ ٭Ͼ磶=൑e* A2Q &BQzYCQ$PCVL;@(ݤAkD].c3A ߴ=SKälƉ ;\F"6'A[%AON"kmzIV. %xÁ䃷|y &I4dxB#p{H3[RV;g1uB%V>ճ֝ He` F.'DMRn:}Xd$VKQ1hpr(yL-Eeҳ+ּTc>*%~{ȹ"q45(]8ФQ omP@ZJO5BoQ8))tw5&¿OB},~t}+4pUI-}6~>M[E;|vl3>>X<*` >} jQ{ IuJHBJX LA,'l>ypv6-2DpzP t'iIkS `@z{ȩ3v#r2;Xg=9LAdN 摠ٽ f11DQxGC%;!S.~0 m]TuseGn!~yTMW,I>8$bsM#M$#ȈTK%?Bby, 1gU*ku V3&F^2Lj͘ UwnoyZ@2F}uo8A[MQ٘|)R-֖rh16AҴ ,?LQq;.ƅz?V]҄P`U/ba\Wi)M/W_otq7TêV]eܽE}*yWʷ_4Ӣ<|gy?.j?P/EQqZ$zݍYW]gyUQ-ϋp()zʨ(]́ %* ^:Aː|N多 yr׫,SF:ZF*R)Ko$hX{4`NZef}rǍc+ )9(heօT.`BV+{m X-rxbRkNMYM-pl*g_+RI!H*KA,md3Jp2 !zeRh4ivD0 M=%!SB: < ø:YYE٭Zohgy*ﲝ$ޑ&G S|um: (*B9f{"QQoNuM8/TwI|d&_;W2y6T#jjG絕-x) ҖG_A5q) ,\]\ezl_Li(eӂXcFKq]HEO .0 /]>w+1L 1/(e:Qu2Vzs j{N!#*@A)gf>A ֢y=#h}hKۥ̮їׄtӜ[]}J nϳ1n:A<9[b *{7qbsQޜF .+u܅ TB)*"%),<~;=yR/o[`]LK}Yl,'_̊yj@02A|P T=. ;Jc>gpAa o!]}$U7ƫ^Tf:u %;*3i4ү$buCmsu=O{o0?]h-7XmnֳҊӕOJM(WT7cw)"s+(*_ԁr%}.>F/xeݻrsF&ĝBAtȃ\5q4+Ŭԕ+|Χ2Q`q.+Z)YkC y:>#6.«9q洅z֚~_5cgՓzv2(:&F_l/a@JڤϷcZ::?d]ΕB*+`! 
gC~j~#^t*[hW;2DYeX"e%H3(D= v4; 6/s|^'ٹ֟&|{&^ւ| >zJK٩Eܿn*j_jEi )EO}^qecvy>7x!d+q朋z/d|EJ{nr  2f'P@9i)3Joc|_qvp:m!@ۛ0dR dǔzeӦ>Uoټ@v[u}ʀ)Hу2eT;$b"Υ=!ss|7 s;h\L:CIb)^-ӫ,׫Ҧhm CWzJ8> VF26"j<նDW(^)B w\Q}TąFE$!WJnz!0ܺ$yMΘ)^m0s'vcQ4p[@!mr:s")Yפ7YZ掣":#nA,vmg7xaK~&6&F#ek8rlb, ǫWVgM\EפtX)R> lPKi"Þdw$}'DxDw$-FWTq<`:5>$^’R&V= KdqBRc_Ukz1^жDI"9G-nyW*TFXc < e2},b, ?sv}npwJ\:㉝4]!"[Gne;b M7^"ud F<*P\-rf4Rr?P *LW( H.W/];Bv`){I q#%)Ф%\C=AsdРeS8z/<0$"%&JΚ9*M#D:bH3rޟN#i)3.m1 vxu(g/$?vqBO=:z%tD HQv/GF-|]E2|lz 'h}bev Ÿ77 \:hwöGZ3ҜpC4'vB:5OzLJ%/+W3a/л^ Eƺ@"yʘw9(5KB$$'ɓFwL8'S–„2L@bRrdyw$GC:ڃjwL7 |f 3s 4: ~-hgL8C-jIE5KKכn^ڀP% 6yL >@xRv}%Uj*{ݞ4`I8 :@bŠ)nz˴L/2%B,ci9I9 qos9CVO-> U^ :&%,,hK)c+L%gNz 1팜A_I:2!{y6`搢sdq <$Y&` ?+* KO~dlkp4/<)Mux0φYXPm:ݸ!}XC=EҾ|z7pJuRdyUdJJ噕sZGo"*J`'`-/EA<5>(X{fX>5ҧ42[/8/ALY+6H!lMhMEMX+Rvf=@uluަ8*[v f\b^Sl[;1KT屑ٗdd*~y-j:ݹb{i(ZeHETq4k1>lȣWY--]]kMA עPWM}1kO]fhI#򇭺htEri;'&tYO^qkim7_Xbg0L̷zdfmxL#ɟxݽ`7tՂV=e?kzxL6cϿۭc{^N ߰!J~Fټw2fᲡ}^)"Wdj.mͩ/{?#"hQ,BBI &h:dP8Vɉ$ <Ȓk c9tw?/>t8뛛8P]1<[ћuBR GfֆLz^#D#^~dW b8bKa):\6$c i("YL}]GqvcxI^ ]y~-<2]Cw;hfru%xfc48D$ede͂6'a0Bf\O7g2[\|4^7>P1u.Q8ǁi$:Hxhѡ'B^tZ@ pSA7Rй RWo^Qk>fҀ٣?ZJf4o%sҜAw&z/ º -d-80QE][v#|uO{";K]("A wO-21e1NK||? lr;(ѝ0(d8Р-{> sEY\6\w!Ln75|Fs5:ӏW0TOWS?!j_˸*R<հ<#1q!^+0|C̵ǛepwHIX(1ӽӧO;'px{6ڝ^mf zk7?QP 9̖LeFMSZnʦ;/ -%m8K9Iu`eJ e}.fNE{c7`uZ|kUN.`ȶ撀eXO;,7V-?oaYa[.vX~wE:W\\o*}dsi~ߊm$DBWI`2 r2!$&9+2SD.)Ac&&-f1 !@va2 9Ӊ+}~Y1C^uP<^/Li^h=71:HiT))f}P9*ɹF2JjS^ |U]T7xnN/‹URƟJR-:ϕ&EFcEt-PU1T*@XQh@(&dj΋>]ְK&ee [bK#$-)]=382CY2\9qSƘ-wGL t 1JΨm29LtkƜF$8>P6+| 3rE^}Mk[ny:_F(oBG3] J[3Zp KmD@yJ' {oq8+B$s!HY.U%oA؋@"JQӯ$"J A1%vF"')U< =t4y:ZaqIt/4(ܻH !g6nXA  I6̲G'^uG#|w?ߋ 1-*A'JJ dd!;B]h]y*-/z,,Kh1f~V ל۬pY/EǠ#2d L4T|+bU_I2Ӓ v). 
^|Fd59oz3t_g\MÜ0'NWR7̭GlBSܽ.2KZJ<}j(>j$ + 之AQ%<2hiN{R)B2 S Ρf%FYS0> VnWim<}ȜG rF 5i,農/^ԡ4Mr'{/ɻz/J߷&Gg.1a`^Ӆ{9 iʖ li+ CJ<H^R]xN V\b:;=t1b61` "[8b6L}eѕ%rg_vJ@`ȋ&\|#ż1aX`&$\2A˺GCQUQcIs@R3, tnm3t}[8_6%f}XןzI6S,2-F)QZa0/a@ЛgqIN'ǹvs$(p*z]escPQKOL#[;sVl:]è͵_^1hI$ވ-Uvzw;$F(o!zH$\%S6;|gٱhY2DC*O'xE$9W6/HX mĄ`tLD=ٍ4!gY_Ѧ$C|ڋV7h)Yd@i,pRzsXEEWsv [RcF'a9X5px6Sq;KgiRzD}pCά;Y*pmr^bɋԴ$yDQj AV)YT?@; - ȥVHPIc3b2Ҧ"'ܩm#e}|S58C2; %Mg=HKa*"6H3vT8wHf6_- 9vbs;æ? W1*񦹚[E3.,,MJܼɺ*u>?65t6Lw+hۺ'Ka{N.ͪTp'ܒ2P2Lp%'hI'~>Oa|s2_8Ic׮%!A+m_NebOBtohBFmgu]d,fv>5Bu.4$Yf}$ztqqyz+" -7 y99;EM)zghi m~✶4_̛ΘVw7ۋyjU+?tZxS;wPFo:sߏ&y<:;_έtHmբBz}'7X_Kj{K}͈fh1zM,#bL;E3,q:߬'zhഐ2d_}c5ԝ΋?y+"L,v%$rS,7 =9|-AK5o˗&M]9׿}˷o?_ra߾o[uL3H8uȃ:`zM;]ijoݴxӜAQ&|v9v?fۛv;v7{k@Ri~z͗Qw0ԣ|5I+U\?,wr9鈫Wܭν؉Kb? P~w-ߙG]f$x#__HR fV[.06\"(nt $&ƐAO;ۑDPS3o3ZiK_1Ё9QyA:C7b|)P0T99h I ΗqaŇg \w>ְԄ\KƏ83o~ ><@Kh Z5$R fP: _i,cq< {~,HH$g>p[ FkHB LB.H / &@"ł VYXf.% eCpJ!:}ITХNeX;;D#r+_1^^9n%>rJ_LХ,1'EH\)ܙIe*xżP7h+3N#.;!}V|E@ &1B*i-p6Vfj!/٬ތ`)@YxWpXbvәg{ $Z-_鲰;┞ϗN׉~SNk-oyRq/2$(^qg߃Di0%Xub1̴q[墬[V[s*O, ѓVNg MB͵+TmX;KzX,FBSYA>,mqUqO.<O֏h%26[8de\J9H)č>_X6+)]UWVfi(ʞGMM4M&ݎY Zwa:[* ZԱԶ` ϗ)LA+/b;eO,A1 HOҋiͼuBVYiciȄ 9e2$kPIب 8|ER9TS?lcX$$ ^@njSED4fBdƕ&bN0X+=H[YΘ6@q"]eh@"gh!餹H#E=Zug4牖:\+:&[g5*S., H`xdsL$ϘUVD(tA)x46uTQp8u]8KRRY%!ԃ.K3dHy(116f$( (.})tLŸtz҃oDO\i4/#g7jaS k&81?tĬxcU: Ẳ %$Qz}1qN!2aYk")JHYPYy|-{MlhKI`X"eJaVA$; \Pټbxw1fD> $+Gq]Lp`.ڀys=Z7Wn_ëR,^Φo/߿>c y׀D5HAPRL>K4NЙ EJL"ӑY3s?L!>`k?Z}K. 
hi;|\ݚ/Ÿ'Vn_ :FgC=79%kv"9iS>Pb :.I^899=T7!@;0Iqy6}|^OK̠3l˾cs= %ptqL)J+e%xǓ t\!s qst]'K-}__-_7c|HxZvlӤY}e=S4wId={Fٷ˵mn2ٗ_?x➆1ⷲ of{ҝgG9)҄zyR|{Yw׉7$}˴$GVvP@=IUuB],}#o'b0hm x b[*zR1xEWDBxHVr6ɭK"q.:w^VCd~~ܖx@&3B*$Aaj^;]Yo#Ir+<ڨf x a6yHtS$ wdxEIXɬʈ/"`$SU[w |}o;sЋsiS\E[}u5}͟ýWPmx*.8-WǭZemSN*ܿAj +)wąJ@Pp1ch*s1[FxgwA ~LZ͍RTxź9':G "^!фܣ˭ἱ) &.W<(XZ h81uqxl qPA7߰&DPuIL脟 I&̏$Ҕ.qv+YQ܋6wαf ͆rqזV728;E.V0J>gJʻe*`vqd"UAdѸ Ȩ9eākՎP׮V;BHIbr5*W<9+Bݘ;%*$J%E e[{SpC'I6"E4qZD=As*xQfn`tkRt`N#i%3.k\1 vxuh>$r+$أ7BGhq1tԵvط#Ezd"di>Ls}h}be[̖Qx~s;zH덐r~Ez PL%p*E [/л X"zXtZ40 )g$( .߭`L%+nZ^Zd\8b8 os}@K7Qq9hof`ki:0l-b`.dvlGfX-}D~5>{\9CL4AFҭkH2_1ZSQEPg׺ D6:beӜCdPqw*tY/{yn4_c˙yUtfUnOZg_^*[{nqwp7-5,!*<ƒ-iHqo}͵D$m-3ueyOm2>})<= GW8~==I7c^.qmn>lo,7nI~O'[_=]n]xL'/Lnqb;O|8k=]l\wf'~66F1WH|:_(Cb6i|OV&Y *!i<ԻjxpZ1t-6b;(g\.41;*Y!$xE e08i^h޹'4jcB$)e`L*p02zcCNX]bG^^B_l 'C/t%ǞPExݘf?79fߜcNl#6D?';YyӇAQ,p4IAz7_G9!m-ჟ>lkxFs7٤OwC?XO)Otz`4﫸*Tx4SǛ:J1*bZHMܻI%HJLIĨ3O9;Mɍ[avFӳ.#u-.O]B4wNgWJ78r<h0MO p" e[.kGO68nq7/LRNiyAt:d}m&1{O.Vl6eztZOj)h13%|\T>EL)KBB"4牁x;4@A>"m b 8Gs[âth!X#P)  ,$)N& dMv9Чjo~kxcM9ҝe(=^mWx]ԂuT'o+ͩx4 `2*$@^4a( |%h*d0/5l~&>{[YYVȟIK{ {5TKy O<c{ (W&Eɠ $0,P-s!VM%WFk!hA9JL)@89߻ 9/ЬCohڎ[QB[|БpOVhg杯⫙>}[qoʼnBS}~y|)n&tNHNBK$"JJ[fIR')hCH[Lro.㶘r6T.F"Ohz#" ^"34>`'Z#T6̑f)K(2rFZ$J}gIAL21jDNa7Y1r:Y-W_oH.+RNK\6 m >QC#gz)%)W9+zeC/%E;7R g!UI#DYA * LM$KZŠ_{;6{G|Ƞ^HC @ OTFoV =,ARK)D`HR(8nhOu:K]XֵMXqArtR0Gk lqUR׃ ù:ÿ"/AË7Nc[;8]o6Cu=;18cn(a&G~.RmfTEEƩ-j)*o s2Z&;iqD{.omW:sz(J?rf%@TGSz" ܅#M Z0̓Ax£e80 ^[Vv+Ƕp5>n+oP+t|7|}3k4a06Zw_L>M'-"~6]0a# *`s L0([-T :HVC7TulqH= LSA,t><<0ǍI+Y(I!A6Ab3DI<{>3ڱfFݭf,VMj)`Lim*i~sTvGcOZ\Yr>]iL6U&aZdS-Ƹ;yD-BY%'Kb=((!դ#S, xaKFRUidFL-X=Wc0" VEk\#+`Di w*# v=䥩pĂd EoRp'\-AZI&1t`/[/@٢mFm}w>m=>Ecǯ8zȳt8QNWzSI08gv.&X[ v(ڕ?El1YpPLu:U 1Uxz,qQxAJʹd <1\MဇN`qϴzcCCLT&3&9[lR6gȚ# 7h%b4~qaN//`jjIY.Zk"%cTdwVJ (PX&lA܇2Tw^^i~ C.Z$(-/Z)Ohjjpl]ZC'O/߼`%-IzShV[]'zӵy5TH߭YKѭ>-okz?/aKܲ5p&@ \T\r>e l,Y) #~dV ]|Z2)0ٝ"ɥ `q(莈xDF/d/7'6*YOj[dVa$TKNhS_--7.0@ )˖Tm8sL V0%utyXѣg0qv3)L[>Dz[V'GgӲCUYf~#wfع,: _)=0MJ_, 
t'F-7D<V+֌'7J:N[?wJ-uȓ՝:]׸R,.Ӌ(?Ag?~7Zf,\sU='琜|C|ۤI^)ߗ7Od}cuS<:樄"{ͬ58CDo3g3izZNfejp{1;-뎯v<ۥ;jB/ǯ~# @;ۆ}8P IuLIq4я2KR^j넒GBܚyRmwyLr |ڗxoma.i6mo^ w O-sGBr0'j/Vg!"c/e?rIwGڵG2YY,VBw 6*k4E N&yHUO-;v}?oرw;xK3oߌw7w7+݊k pڶ-Zu"ly@c(ʐ 5{adn\rCKnşO<4_g2({WCt71fe%E.jtFf%GOE+]Y&6]dЄ[uV(^-V$)܃'جˤ^yɯC۷xkk*h|ԗ%tYf >EaնMJTwb0.Ceb*hH!lM@ۙDl1c gp!A=.# O;Y)Ӑ. g]HR%mY}G|񻥁D6fp%A6JҊ+!R18' TI 2cEމ ^ƠLFn'QmnH<"6j=7z3$CKJ,IԃŤSҎDɄ1$%"lJ;5TmOR@rC@hep{0]0*(H풣SH*,e]X46Xg=)$ln0(_`x`- ZSCsť ێu8Q䅶%JbVxp܆9 1QT c`mVs}B;uP4"Lq+V3`P6b^wvHb+ ZIմSXI(# nB˽h$I?$+sꋐAVai*"L. :XYT2%Vja-z]|%k#!;8L0`r% aZVn ܫo1Cێ+@Vhu%Km.+IT$%-#KUE1* [UJ,d8ƒ  qq ` JDlJ)8xR` Xge^ DOB3i_%~!9+% ()7(s(e^b՘V`bБ )VbW߃Z]0J@/Kw`^V*K[-d5 .VC`@j<a$:j~P+WpՕbsӄ~C~ck=lr VnF1/EUPN6_ */o ?p M;L>ŒbT鷷.,+}XumU28)Ztٻ&m,WTNMIM^a'Y R[R;h%uS웤b&/p/= *[$4t>p1'mÎsXe@dƂB %A e(|"QzLԴ Qy(}0BSPcDJ6+ |mH=)  M` t+|x^[2h`G}ƂՙN,TGy?GXŸSy  l#d^E/$9b' 7Cv{P6VweYIB[zfHX/M=ztLQ: LNyH! Un]H_ƕdT^H@ʀv19J6&ZJR"v   a:U(d&@ Y:Ō@PXYR ALUJ΃,;,DҤ pd(5B~2*_Ӧ2L dA +.T(4C6E.|taB`aMni ]T(:Z"MWuh&Cdsb<9r@7l*N'yVezCyEH!ޱQ8/Ӯ[vu]l~9"$h ?fWq Xp$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W\W0X\\OFpk>zW_ (G+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE ,˅=% dWֲ\JK\9$ HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$Tp`}P`oaZћZ]bS0-~vd&o@wu-6,Ol,E|LKFd,SΦGOGë.obMkfڿ~3@d(3p+"! P^"z1)/zг5dP}7<{0.?Qz[f%6*sc. Q֦}z{D?|~VxI^^yt?ǽ5`JTʆF׆*K ; .6Vàx5D |'҄/{/ˬgR!sF[ˁ5\Zmxz/MF7c372/ZG>z2 ֡7 .&cPg~@龽}7s1"@jȻniEeio _Q:C4QF$zT'}YCVndo›8yŢvׂ҅٥@E];Rÿrw# :VZ9H#/GCGK(k[P"h^Zzctڜ|s{żǒlEGd[1-XKE6dΞu<{m']tKY6ϫ:H:Tw{9TܫyhY;T.u6Ti.çs\K76{n>v HC6Ɲb|&Vw.#0uHLeEGA^>Q MnivWPnܝL [F}A^oٛP//2V G=+0~qe6rzsKRIx5BPy|kڥ'jAI4#Sq}Ȱez|\&rta<7y3OCm櫣}{v7ҝc8YIN>? 
϶*x0F³;Ԑ[񉽼lyy/pKwZUQEW YDBU*aYKvLXޜ/fN&-_invwzX&['\n!1hvٺy Ek47Cva_b~=Pj(Ю|VɀkHMZ?Oʼ]ieW^cYhVWtkW=$K`v-p :r<[M옱VisNd6nvjY,γX)NMnd9'hY}D=aZNl.aɤnՔLo͉cǬ[ũ6Nd!U/sEO~nS}k0O˙e9&7siByu:H Š Xu!Bj/r۝~?O~[tӯѻmޓq1yht¿FUՓyNQ?^m.뭢5 |݆WRdL(IV|1S7/_+qmWyzmq׉e詺t0~|ti8㎚VlwBs[1w2ZLuÄ9S[9Yjr fqNѮ Ӝo:Ftr1{e:}`?_⸎\5?QUFF{5e~*kDc6 1;Y~j?Qݘtw=]3[!\\?{߼bV w` ̫;k ؊n0L"z^!iͅ6!5[ ~Qi.w50.XuT-{r=a)v7\妱jK9ʮ/ Ė.05=h= ,aL[ƀ*O@mWˎ\' @} ̉4]qWN*%{f~5iuZ6rY亰ZFNYtа)Bˊ a>c9޿y;ԛ)Y=Ѵ泉&(2/DFtBtۓdsW4#ʟ9e~#u|mћbqՊ׸Kc!45!ds!p{H$!8NF)l~5t'ْI@y{$fM l4KKWݬV֡{G 2tLY]ì=R~`'P8e.ozۧY8 .~Vt:D:5: Xȧ&nM>?pM_X_ >H$N߭-/e\ '|,{:y-_=>]\GkMx,KGuw~^͔wZeuCjee6 woJ;PVѧ%St@K[]ߤyٞw=mXGWM76>t^ iԗ-dyX붬싿6gO!VGFiWW^e2v6q#x`U8T,3cEQvsŇQܗ5sϭif` IcXjy7͙6=dGkц?x톭tjs#; r/ɍ|:7G~`_ĉu2lq/ȅx!`}O{Y@ʓ:vЇr!OAِtA:h$,ٝ=`~$صTcƥS^~6Z=UcrDtDxОilۍfu!UZsDO/)YL]Wht99!rOs;72t_@z~j qExoYFR(~,6o#9k:57o9϶6t]E}+E@b9 L{jQ0-%,wW: 0؋N%cM`Ŭ)cfƇd^R"9םe"ˊҩs%;f'yP͆3$K G#z1ӡs^t*aPCvcLCu;V̬pniB饛\XB/\ų5ՈB/fU:@Ų$Jn/${Oѭ )M]SXG0Q,~y-᭛%{8y#R}<͉5FZDR)1QyוaFz#;&`0/'خ"jas4 6O2-^,3hBI/fyyEH8JU8:"&FI>y, 'r|p4M͉T)-ƌ.JOQ&^,qt ięeh)GK5;|9C/0h?`J+2*Dc}4Vޛ覦}^ɂsAv蔩0\;-Ij^GRH-u8@qTUGaczqMb |pijhtvrbT 2nڋNӤ1"9׊(5hUa<7#dy~<[,kJl]8OAT𻷬ӧ TUJI+ǰlD⬪D3'2ZiK78z?x_qړhVX.o:S"JC2űƨcJ9фd\d\d0 E A~{S c+:}^=x6'_?'?},kBscXv@<Ē.G#=$wψDsq,Yi]rJ$ŲIW|58%FXq& nM<6+qf3O{;mgvUem԰;>ۭauOeSR0b\elɊb@[S9] YsZBSMC(8Byɻ'4K0k?Fx7 9m{$OA+@Z9*:A@X[f [Gbo矦٪>~zq|Y.T1;dc @MqF>>|^wTo< D5lnyo?Njr;1AͿ0;lzXF]gv?ju{} ߯ܖNksΟ~Շ#K XuK??a$<1[JD?к(J;XLIS$F]pFk$``ENC$q2M ^?&1iee,2p MGoΩNjF۷.-L]w2L$V˂Y^ & 5JZQi΃|`4_yU B F^z?lOu:VӦVQ`wKFs:`K7&'0I:g8DzJyEm'5XOEsDrt?Jk꣼Fq\~`&0J$ċY(hdWJ@`B,Т]͚HL1IT_2) #--ƻj1z7/Ў''[. 
VP1de̱..]Gx$r1M=4Lzy}pn,]B Æc.fU'ѽ$.2 %>ĶQ~*Fla9agv֛QN`0VPǗḉߐRRB77^J*ɜ0N^4Q\;'xLDgƤ-uym{>"asH S1lt6#HPrRzQaJJ-B0Ehhg [nR2r&e-il4j)-4hM@gdd΄U0e+wRHz&σ4}x P S/h58ݏVV "A,h>u1I>Hvj|ul2hT\,| as;+# j][nW>:$VfIoAq^@:@P#ǵ"ed*̗EQBP:T{,)qWƮ^-!fStz>X]@؞so !E6phHz嶜"Omq]CKJ)GSIN(KO|WբT&C9Li"7iף=~Rnp,_ \$4ZR`hv:2*9y_zY}$>crAO#Ѓ݇mFH~SZ Ax9?( 3Bu4)ii@U|ħ ލzB7-.CAUG~ZM]O&HJq^ cAݙ#ڨ..՗09a_D!R(oe ŵ(rtCo櫠qW'gr*'K\&&#+4=\b>W-u ]-o Bs;D6"mP.6.h 07:p gr=0L. CI"s5;o3hqz~[P#4)I])TwGxy) 2*Us{<鼯^AR[>B-,1PG-nTکD%jrK/)Я&Dj6I)Qitܐdz0Jv|tw]ͧ岕j>$eQ ),Ϧas̺*ݺ"qOC} HpSMʧ!8%{Axo@my޲>*e5_ :mD)!tx%4`k@a2̭Dz9$)@;q`;5{3|9ˋXgvI\jrzՋ_CT0(Н aIXW|q4A~a FX$:ė=0ҪQ?sVR)_xT&PQk ! #B IVQP٩y@||(L.h~u/cT2:ATU}Oqݚ9As.Df? rTo-y 0ݺ3 ֘:dVƽfwHz(;ۃ/las4 ED&+߼6+&V-T bK&Q11J@VI ;6IӁagF JqEG=a$d ISFJ #4X[4_?Tfz8ԕzX'RZGc{q|'?YsKp /WbC3ޕ3(tRSXqUIc75̝-]޹bhC'KTK^B123!*0L ` Eu:.k336]E$UvMz_1WMSDsg&FIh3k >&)iNK f`as!$ɨ. mg c/WU~4"ޢ>/KM^j|y2dž)rgo4Q RB/Ya)`WA>BDA>X{'*U$*S e(D0M#ĄVɘHd(p-=+~pcTVv2be7JDB̰=PrwBɰq;AVG1]BIp<*9i_FECCW聆$%4h+IS]Ҕcy1L@m4n *? XBʵ#8\)skv95S|t!L^+ٴ*V6H"c] +#%4ꪲ Rkg+Ui `bKU௭ ^_P|?x9jvkhLd2]oVvs{fcd0h wcy"zJVQ:A&|t8Ç:uG(*_pگ ΚH_}JnQn˛TE-#v›DVc0|@'ܳqcvHN۞kXd En!`^k,i"D~oږ$ô?{VNTvuOAʈYV-4CptEjTap˟}ݴĝM8ܺ5mnt`|G@l\ұ̻7lY5S҂YV\NPߩ8qTs5hvUÎ+jօo/r/繶%4%kcX֋U7nC =4Byy")̃a`ҁQܒHr7I˝+L[{BX-<+zȗ:DW-&ӹ ev8ASpUV7vױֵnMҏT+ӹnΧs%i+}rGБ%'U-9n4r|թ 9}F1z>b!4Aoo2rcg?|]sNkSjƃ<]YEվz!"͢ +;c¹pn#+ Xae.9I{/`ڈ^WBd ʿD|t^|?/IEV͕~֑ -2̮jo,g/i8sڪf´Wp:FY&TsFW˷7'} ;Q"2/D„/L뛐~"8OLuV0۶`"O~ C18r.ZRUAez#f,Q%ƛ;pŰ6E` (p9g@Tmef!e/~SDMR C0Fg9FCPl%lzĜ|z~x3p=A#%MM"S1TBHαBVw''mns+rѕQ̪Y`x.Y !٭Kfm\mnD`1\π,CA CKHD(|-ۚ~lS/_$86PV}`(2}ݸ}UvLEbT$:,D* !tʙRD:.[oS뱦!IXIǧơTa)Ln1`g,KmɏW/2 LhVLc-@'(pJ 4] ˫Yrn2SM". 
Ϝ}H Z* Nf^DV.mbP,oKṢrcf}GCki *w*4C#U2Q6TZ2-?p=N2xMx(p|I A BgepR˃Jp h.s.Ԭ'\77mɺh X򹽝E!}J%4יȉ~ l%4hkEWJKFHW ͅ%3kczS!=YO.Җ-춄@m4yMly%45vLkUhkٕ0*Q} E3*zۯönStyϡIf}i@C#a`5D-xm1X- aW&#FcXwbM񎔗- 9hjӗ-# S+-6,`/BDA##hme..#@bDҡA宝OYFtWYk.&%4|r4}#4$5Q"DEbU ĕ}c.*/1šFoPJ5h 8>hMQlXD0V]|'h2}\5*Ć}Qq=AXTuh)0KXI\T+^ B@⟬U'PqJD}!tjϼTݸ&z^a0Jhxf+f8t\`Zg -'3&ΒxiI(6AYZ~'=p㧇 \@Jhoq~5I9q>`PZ9+ո"îF@A![w(f5֕AMG DkegEҥ++TP6\Tk1 f6#aFUCb[[ߏr^cqA%4 D{݅a-TpW&qw]&S') hxS1#逰gAEkd0,pԠPikcݓ db6UA~zN8" im) >N"xlBH0ڗ?;ǝ"8D~J)t%r>k})-.47>#ppP>Wy(Z6?ޑJ5LTW_AS*h6HiSǃ0kTPV8z@AݑϤ퐺5VO^Vf*GO@ Qo\I^a(c%4Ǫg0a'eLY8B/A0YŐ)2_dJC-eû{/S. tNݍ 9 >T8>8tez@Reu=>@Kq:* )xWi^TxJ;'Bwİ2~Q BJ.7p2'äJKkݤ!)?#$t:an1uLOQܖ}JDsN@sZIv墋Af.;1ktfN^:SŰȂfoOSBcsj8=+=?ӸC j-[3-gtg/% W#dҌGX,wן֋Uzl7Vŋlk|OE67洏f/9Kp9h&T8s:$9$b9GDw6VsfXT5w 5z*7Vl[ I\pc:cl][oHr+FQ/ ;KY`&ه`1 )˒%[ұ}>Eٲ)n6u0XU׭}1}]Qi3)y[:BJH'KDb%eS=n*8S A6iu6{0 $`;A/{s0ks!{Ҹo>iДA99&ҁa.UFy?h留`ѫK[-_-_۩Rffy0p5pݩ$Wx2P*VR_d %k/ʯ{LP%@cTuƴ^٭snP p"JQ/5jwcKə`-K铖`hwDFϋ0c޾ѵy=YW:a.Ъ\h@rҚ᝘zv:7c4ǹ/yek w,a$#:n Dc_\_/1$$]~y(%2-O;}Y'5غwYa]QQ= 9t_//^;T1-Kjԑ$L)˅Y+?fz}` EnˀHi,FPzf,g!𢿯 דMN"XB@5cH7_7B6 5d/ p3tnj? 
Zĭ.@`ab[-ϳvUz#v]ϒ;ja ϝ  E'KpZ8R⃇[mrh&duح+SiH"@Q!wD/aœo(*_ !ʘ Ɩq+\6}&:Od `g A=RHM+ TD*mV[Ӷ7[^JML\wTi0f)j(װ1MBn0C>7O?CT9j-eJ3Frz>Y6Z+w&nϷ ?1xg~~.z5eaeLD]`^KI#[AxS^;KӅHCηv@iYfr : y6̻1]ef:1.0R5O_ 'MQI$0&ż/b1bnˀHi,Tdሖ3CdI=C{~i?P%R}y띦$[7F]& yvCJԤJXRp u.j'W`PEjui,ZM|\.l?^닛d\9Uު297r$AD2yXBh%3'\RMq.YJU܊O_uY0Ԩ9qg +RX˾ ̃넫Q9d7zJlꝌq2dkJ<2^U0%U裔G`?}a`*, dj)+Ʃa]߇KaK/woD#䘠rIW^&F}d58ޮl1}-UpN2圖b Y%14ҳ_$ 81\@Sx9'Ys5BU-`o¸/61ɰikes+v[jj۪A32fWBNԀKh.PFh1"zFsq5uWJ^5\˄$dkx^)i#S@(W'RiB|m,d±Z*(y^sIepVS8A W*֞̊UQBk:R_E-X\7}xXow`567(pȠ`>NbL~W=< /CkK#ϝGۀ H9^: ,ܛ;4 (W:-eJn굀1v2SRPp+; -XJ eh WR/NJd87~Abh.4&ܫe0)Ƈ퀅!TܛI)`U([REh"`KZZBcEUY_USFU*ŦT81di5!=ư4{[qň9ϱȁH#N˲*:k,4E-4{heJ) rR&`E#D~'+,zF@I0b-X4^KǩRNNqB͛kQ;L$7xLe޺AY 4*E'V85E4v@ȑBIUi<]U Es!Jր2\wPG0  buGVa U)Lcp6p.4W ͣ%>QD"Xo>W_z2ĉbe-6PT/+yvg\|a/# #4{etYy7`@ M1A q&$r'LpA/B-4WQ/W~8C{-:F5(t"Ps%yU":0YvB%FN̿H+YA-‡Ko:xBRJ%ԲahpsRHWԹA}G+G-Lv%oŸ:0'4{/u_bZ5\wM$qpi&g ,H۰]ك@$`eJA plY"WL@\ɕy:?xo /G>M79<$@hb~~$zCap lu"g 0ÌD-4jՉUQK` w.\.wBw0[Oltz `*k /$0gŗݫ2UfLn3`8O(84zӚ2<#}fv2k.l /tZa{X.k2ZO튩YCs|<{9FX0ڤ ו>;Tb#UYȹC. 4ژRJx^ZUasTiU6>eHXP5foi?E(Z?q6%^?*Κ"䴀6C*:fJ O1k|L0;'6Bx_z\\;jf(%qDV+o.poF7]q tC3ܬVh2eUX3q|_v)犘BO ǂe?,rd-QɬQЄ_.7?oɟsV?_?E4KXɼӻqOdE*TeU+F GJKBcGO>I1`ce!vmy} &H+c'x@ r*ĭ)6E1أ<(x\jTs$)Q+4d; 5g A0ɵqV!16\/Ѡń$w#{ˠIA a[t XPq)DeG1jQ.c^z!(/=D!d$w^tqo򢸃TMarf_|D$͊ ^et2WdqY?{֑ ᗙl[}Yۻ`Y3W6u#iKq$%JԹHb:IdΩ. LShƃP*DAc㣟|\Xч^8uGY X Q[Bǣs :tۑN+l0 j /hmOGК,ot{DD4&l{kZZ *-x\xG踐N&mh'͈hCo r?U m-}b W )G4BUa-fmٽNgjЏ.I4hثXkih4q#rG$zm@[h>4\() TBk#Hk~H :R7j_KYk>H[Raq i_5qmT wgpӎlƽ)}  Bf쁣L6k e_]/ p 8|B9y2Y^:]~LVpKgʘ`OuiȦVɀ'IEB? 
r'&3UR^#zm䚃Z@$ȵb+9VATkaވWCn5p8%;pbq)w7o~;PN5tV{dr"SIo8rdl}廢q^?8B=߿_5=Zk8!v)<n.Ġx3ڌ.W2ʛa'o'f:>8LPEۥ Kڲ[\^H:G1$(}:wN  2B oI&l0Sp%6gӬ}( &p3TS"\_0t8'֕K]Rͅ'1 xYǚv?:Cv=jXBw v`.zgRrcq(̕uy*>1G i0xM:b?9{|q]\O.l!eO")_e>kN)!Ñ1۩և`yl圵ul\/iߖ ,Xc83d c7y$|΋6#qz6|>(aߘkTsMwbV%.#hXבlԶQs0G6mݥxnoځ5GxdR<ĦܮxAogx9"p4H){EgOWp9xϕ /A:$.-dOQDSoI|s{€_/7W}ջb#JLw>pm}u~/ruR;$`p2`d)/-kVЗ} 6pp_ZxgJѳȥE 07imw3?m^*RiOY g7<' @E#Sp %(/"h3#8P%rGK?h1w%QHfctѻ|=#o>76Ph)>xLb]wѫaDD!ˋz듩ц +.h-)f ƄcdQڥ۟>e9%\K(DS7N|qA_`>PLJ}BͅxO ߡGZ0H5]Nx{dB3p471` %cLHUΊ}Iϔ4Ν,Зd`iuZP7සz;$3nL73_dQV+/ږh4 y=2 ơG,o`O. t䊃OY+Mbo)E`u}[:>4em;ćq>{f^No̰]e N P4 nAm 50m L RH;9[kX@y wtJhSkR+;θ2Q!="BM0-> Z(*][(!acc*ngVf'o6ry* O=XRpAf>x8%%EϕIÁD4˲+#~02dc>wpZy-QU9^BMIrHx??GќQ/e-$HJ!4G'U@iGF=\%ᦞ1S<]PnI*vJȾR'Ѕ]~>qL= Tݩ0= _VQxm0>JNWѦ}+.ٔx.'\}v^3}J8sufӾ o ?D9 K'CлzunWW"g=)EIR[hthSuzq+S o}AQyIE[_EE )7! e%>DrE'tߩ@)غg1T\I<͛x2C#uwa\)I4ѹ/ ZK;FuY:O쫲:ʴkU[r t?0l<QӖZ)ефF?~aÆ|keWJ+)ЄOښpE갖sk Grb'peA윓%yz&Hgϼhz6$5< Է ϯ;е:5NZoyܴ}*fjHV4.6BV}k;-G}xGϯv6x+r_&$nI;ϱ)}BOYcQZjZO8C1&A K0*]vN 'K⣟|\s?^LxQU2ʼ㜰ߖοF\$,(?4D8&.Tw0GҹO(|36TJ!;< ξNΣ gYRo+x*tgvM=&$fE9q87b ?nU_jfvoANj)2F$:ctvzr cߦnuzQQ ]X'r됯 SKdd} P}%Y{1= uԑo]DgPrW:i^#_mBGNoPqBCYޔ&i{0tXDp#9ECZ[)W6swe[qzǩO|1ؿiYt'|.nTߋX1i$w4pI O꺳h wVͱsPX歋41~&OOQQ ljMI u^[ gvi7Dr "ڐHo{%Ԫ`[m$[׶]OB5uja\;Ќv%OiY!ZY3Ijo1kͱ]Ja 5ֵAP;~v$7vfR@Nɍ\JIS4ٗ ao&ff$8x Py:BzEP>Λ8< [ ^]>gu~V+)K<@3;zX џ_o6 ~À?^ldpѭP oN M\QVS"4BsL:'@viPh_,Y ѥSDѭ[rH:—/B)X܌FT-xwDֺ+"R HĒ#Y/ w{^l4A`14PX% ~0_Ep ymzu]GLUP]1KLK Xu:Pv$#??p!]nyW I7ז^rpؓn+ꈧC_!RNB&+眂w}=}8 T ,5ާ=]r 0!>b#ņ0⦞y9xGϾ&(t\2l/ Y^6dX nWF^㺗eވoDgxpgDARi|ف (<쀈ߐ 4z%_ Yݲ_6̩e `nL+Y]>AwۨȔ6?~jijB,k^|' ִ˟_oK%䄃4o*??~Y˫uo\7s?ZKt7dsRrWݰ/@ quw|~_m$0?x)YZo2O#itUxS\Ji$'ھ`SjX3rќI'd 2%p":YDª-H"NQ/ϪoMʝoS$I8Z SX*vn>֠+Jl'a%v a>$Dݕح|S+NJqbio֖S`^1;[r +[ݗ/:t>_j _ ᩝ:祡@ ӍhckT1N6Whh8YR\.kRRT.bB/{a2ںЋu dK 3FoC2Rnו <[T~-gMonJoA{oOpXd> soZ4[O0#^.E:a*zŒ *ERltD3iV1JV(0WBH%^!:!{,] VUgae-.jMNć8At8mG>;V') QMKmnMggoaK.4%/j@bjN\l'Rk\fOugC5[6{`r3V|LhQX?p?KξwgBI5{0d`aưh!nZ]:d4"HUr:en XDoj͕)z' j7T%yi-0%+e^ ȵ uͧ(ƨYѫxח5gIc 
Cj>etH!G溻><ܧZcΫX;brH"D״j,1$@%u/܋8<9idLF `X2͈Ƿq w?˙fIjfBfKO|>y=B𑂿y7;wWϟ_^^]]3gzOS<<{뫗XU~UnKW1M"*r0̛>GφEF8!"dO~,D;w8t繛-x'CMC:ݚh[i<~OR^{:go\|~ύඇOXCƵr'X Ȟ~\%Nwm9ݜuOpRIfRW/>>ÐΆJM\ ۇ~fv3fwCDuY˽ޅ9N)ZGlffĆ█jVjإ,fg7 L)S"y>fQ6}fClx{_m](ˣfڥ7g^inye6lnl9a0ns4x--{ uG keg Tu]j**NzfCRl.]Wz_Cw#72TS.a)̢RkPٕ4pg!v-M!Kʪ`z̬֟&H8Ts- \@ن{n]ǫEؘw=UpįŽq5P 끃0{i/ѓ2@b^׻h5Ǥ̛8* Dd @zIk=8R@1fe cǧs )'''$h2aDI5X&{$1{UZahlc<T..1O'+>Գo8|{¹fjraoRj8;+.G}m~a8HbF%tpLW|̔ą@'4vi'lM-֙Y y,n:'٭nCF'&*)x:TF>S#d$Jk+&%Gnap@`XI{r"B%gBKglIV)UM"1\(2C%vZR ݰmdɪh>W%6h潚6OŦ)/u&>3$z-KoVhPzl~r|T lm| wKf$J Qgg s:@SmI%\s HPlwt0š dAFh%=arpa_mf!z02%R{0^1+M堃~ڸ;T8ө4Zn۫:uv7FW-x:o{wž)0|{wwR+:fM`]1zkJdDCJΣ^à ātZ#1" ZS]h*>e5 o|釆aôov-u,Gr w`#0zQ^K'^=>l}p%OKѩ C|xߛ 2 )oJ-#eAU4 ,3B$ژc=u-E!1ȕ%%iawo2j,K`4ы3n d##4| 24v^D-%0ߧp#8| h"`gq]uBIvjfmwJ8'nvY"cTjʶ(6>O.BRzoM+"Gŋ++U,f`@P 6*+p+d*@E yN}UsQ9hh"O,6a D`O T\*WE؆5b֥)$c8ej1]o5*w4+Qc-÷%^+j{5OXё( C[[VA#IaX'ۯROKشm`D:mʟI)3֢'>惊u JnA2E5Ǽj0%c)'l̹a5PEQ+{Mj5Em6*;1kR.$=FP L ZoԈTX><jbA4S=5BUϯ׈,DɊ̪8 {cl|H@$6 vTpS4Fԇx9THUx Bһۜ8.߼V#9gf)ejz?&_;f>v;cn0_ 7?Z_GW] 5-}-_/.7v;1v߬߿%\!JWXn^*Ɔ~sWReW⬞iR_l?ړ_6e]gzjC:Nyr׀:Zq曗䞣iFݚ{5)5 u93?!Ͷta <|9-X07;(6[4nx v T!ra&J@ ԞMQfZ%b $ ->}pg{OJ̖+r/G?SB/`Z05?zs|g|)nyS|wn"Wf9|2s.3c^][y<}aI_͵-_YpO{Ϲ9,M/''cM?cxvV[s޷XG؉=V[,>1L`1]Ҷdceԛ\_8Hf36|i˻CFPgԫֹPY _z%ЯGOm`JbD25נa'ךnئ ǴwPo&>hnv&жi|#GvuK3_`,tD'\SLMOBaVS6a||p= u=1w᮲0.v\w1f)_lh,TlP9=a&8PrdNgb73 "1mfuX ɫK"3&L\$&Rp![rŐP >h,C@SZ2">x;& vn}'NÙ9 ,ZvA"Lu!i뾏%\RMbrra``mor MP`޾'[ͭ\ gԦVZn9ճP>I; pUgkD++s5uG0`7NJ#sv ^+P6 ٗF]՘G!B? GUs>!9<(" +q3DfVga>dݒrAsO nLO. F =Z! g]V%74IZ#z:asKަH !\t*DTCY$=1gy>g21fI}*U&|)% Ss?˕Cr{ElY] SAR~(mRq&!*>^5C6$+,F3c--e>p ׵$¢ pmnԾ `<7/#xjȇQo(9w2OjnfO cP,=g`Fg)Bpᇅ0waZW bwj1G:a$9 #K`Ȝ=LѪ۩ZpHYw`ePˋnj Eb#ج L6ǘ)@Xc*lM XkHVJXOha|[4AT\&y`IܠҘep~ժQ}̶؜f[TȻ̱KNH'{hSHcOOƮ)j1OJfgmsXs &j vN%V8V ӈXEzyyӊ>(~D +ޖ>\WZ εx5G&R2 >QKi/"ՋB bH/>78hI-eK0FQڒ5(=aGk9έ鴭iZ+k> Vhne}lo.9 л9 ða/8Ƣ(\@\˽\˨WTɨ*OH&ODi;1'%6(9zC9zz=̽(:$b/}ދS`}auF,՘G`K΍D.P? 
!Ω;dj|OpQ/g_ܦQXt\-@!jS`jQ"C!tizZE >pՅUJjC ToMd?*e=?Zʂ3JEh*L#ȜmQLƎKo;:+ dTpD.ieU8s{ ߏd6\ɚtkܨ}rMO[E^_d^)B/ߞzVgųA+`ZCWk|ّmSlK~akSeiܚ[˽ pз]`Uv_h/QdHpBN{y~;د%G2V;d[csqr+ ;lf%GdEytaﵫ_αS.03~Ps_<I1Y1`{Ŋ}]uUGPa^U1p$|.'rCc 3^|9\`;S%^Q:Fdwś }|q QssLq!AXI>SUM ZIFOJb0 BbnN![r1 IRgŘGKY[ сڷןLU36 qlMvDY`8\vx`N8-QDՋA\Co9$kp8|s@!s+ccɥ&,qꩌ!Ht&QM|k]a.,K7}$AɅU}wra}C]p.`Lxtnq<2k6_x"sIF7p? %GUOq2w2\ΛF|[>JolJH?XUJ-2-)!Ks6 GG(թU#K)voéQ}lK׫1F_;_>(ѭ(AJ$T| ogZq_$#{F6m-ǿ~T*UeYʒF҇nD2A<>x]pMpi*\9m<+'9DGnzqwyA^p@yZ<bύ\mx1hvo4$EcXNq^ZǙ $bB501VXq3j0(E떄0“0\%摆 l, ܛV%qn: :A`O!EhR|p kf{|6`D1Fq1>k vS )&_ĭEcVڡI02a= 9  qRQ4Z-i#Y_ h ;;֭myWy{ishN4ϹZRGHf-eo|2BS dƉ:ϵ>tS(DYhŰ&Hl*c"hB,7_Xl,NFn>넣3X8[Z~b.sW}>c:M@ "š^DEQe Šmzg#^bjJ%6)PbXHs3䓪]Tit\9.n;jӮLX\+U8v/yR`㨓(h!DIZtDBJ0}bt'-%'g0>dש?{74|1 qbm<Ѣ%YoN\1ݘnS}doůgrŦҮ+Bq?x.uȌy4g9䌘Z40À: %:mGOcTۢ! RDQϙx8}C^0tW1E\B N6@%Dv=ٌ7mcn& I\߬&50YjfeCkb=WguYSZ,v&mܲqE,vҖ08z4^H!\녦6zS/#Qȭ[K"ϑC̏(z'wβ{(e* cWԀ6g'+ެiRNYjɇ$rz՛6#I\زWMPB_5ю]"RC\0-I9v$OOl#V7]d.4}{g?{gr@t>i6Xcحmd\ɵdٓ7f_:%8`uU?)Yyv{ڄ*7>~ <[ Vm_ F4]snVSTؚܰ}{\+)yxђ m@S)~IƔ3"?<ҝ@8r|PtWY:n?k?wJ %pIy21S"K5O&3HȒNN95zMȉ=ʈK)gMd]5K_^gHZK,f5S9ۃ <b]ѓv#7 Zλ.f)'htu-bvNWOSHTr,)BݩLb.WdHAە-D/ yû#?71bebJ`JTe-:"Q a1\2&UX 6ꎊ-F%2ͳ-7GifM<kyJS@5}J񖒇9g$7&&Z}oS,zE5id!k&ۓx{LDb!tqXhK ûZg̑FEa8ܝkZ?yjsUj7NV&r/؊kj|\=oPW+U:7`K*E2cS~VwWrkmPPD[GI҈1KhUSR ya~]fosѿZZJ(y7G&ҴisMr2\ cYupbV^Ke4tGGI e*~E!I&!zwci^{˖mD2d˛b3}l\jG*mʦT`14 sڃ/c\dz<Oe|M)LIM!ko7ےߩmش0o[;Ռg+L[ra+m=H |:[]Q`ϠIH:a[Ek#h(cDv '!Ȯ.V:?cxFU4_1%ZQyzqyS͊^y:VD_d+Ğks;rDs=\\֗xKva:=BH6^hZM;Ű^?ʋz 6LjE=Ydc\!^?ËYC|Tw;cUźn<*o'p bh])9'[ݺ:d!GI0Y6FZt ֩b:\Nf-%'D,x v69aA|î<`LỘR“9lKɉ=",$`pz9pE@ =pC[֋5p'[?yC1Lbx.ByYk+vV)!y͐>2dg) CEBJh5#Ao7EO}vq5Op|!J6Ceה<ܓ?R^ySɎC,@)N7mZk~s#`՚c? 
mTOK;~n)y؆bR kr]?y|CGҳNXt/ȋ$7QՇIf:q/.*mqrdݗکK@ 2^ݬbIIDF+5yh+O6 SYC&'FzK;6Gs`DXmL+ U L^{@\}Wo$)pK.}x^ڷr%}|*R9y{Q7C3jT.K!a[|Ɇ燪xS=_>/"𲾻|%|.5Տef G1Ek0*:mב1Y=}ኙsg>!|s~}|#КG3O""joh8VŹ9%S:Ts,y:ws7B1R ډ#&u^6nvU_rSWh8j+tV X5VL""k B12svC@y(.&Z 펫>v!1@;v  U`: ( Vh7ߘju73|xL@*)1dYob#v&K޺lr#7nPsZmdS ;@{q0)̤lxaWXAsiqIKzh&yc$`whM\ځTR gM\z$vn*UTAΔ)`-αj]Qmi W bVB.tF3E nKG!YYƐ{+țkU"õ9" ]J*F^wSfԻ'K?GnXJ!6KM[4lbsatѩ!#cpq w?"sդo>GKta41@ RAkI$-9u(Gs6lr>u`ji-E/%0CO¸y|<^, oNiD&B 6% ^Xu?_~;a!.ڜɤþ%Q "YGֵDԥ#A S(蚳^oNl 5 ^To%QnEٞkr.xwj-FQ#5ۅw;Їۧ^.ZRMy93t*r%}_qZ)APh]7Aa[xEQך ~nÈĘ]wn}wCY RMg47#!#xg' &zzzdƴ$ǾEUPHHLwUu]6>W6/ef_gR U<5i.er|wr%n<؈W. 7 S`тS o\ja)b|RJ'$2D`{$DrQY)ZLQj:q=e&TM = oc67>X벓@.7nZEbjIke>)#Iie%#{ ` E: cc*@#潮pu\7W!W}q*'1A[mtD'9t/+e3IJA'R,I>ĝ!쐕VmRwyehK>\E۲ r"Rv9hkr/+Pƕ'Yt/hxDI"F 6R"S7ǞP+1n!{/1?1mR!i"dKsM!x'=BD9f G0ːğHڸ$m0J 5 7!X礍{J>p&QIAyH7RDw[E Jꚳh )*X'D 8>@ ѷ~{BQN o'yv~NI8&tC0"] 0׳|zPat-%?kε!$w$@"[ !mV A:cN) U(@.Wwb:pGyeWғ9$5"t@_QIA~O6(dԆRR kM $/}ĸ՘PJI"jܙdꞒH\QuI7KΒKÞ嘯F ==9OPL|gid4 Bp@P0<޶۴5n[zFQ!qj#žUf3K p"F_3]+jX5J'9mJcߕuKrk2**PA#gTrA3]UY4UfQ'lӨS(TFnύ:%PqP-}D YGEQl m #*gԿhAj"\hAUG]$g; E6* S*|y&$H# mV}@l UpM@ "e@>VB5!`!4z:^w@ BhgS{W<)8"e g3hĶjpe`#i6Gh]~ P"shlBC+,t&Xn!W5 ;Z"Qnx vTSz# -"*(g XH7C D@^ 7 P5:O3$4fG%މyT(C5BcgU c`44"ЮVj1Rױ;qCF6%ڪڹ%6lj-+FEP)Uʭ~dFDȾFXhѸbDF7,Gg1+FP}Koف F9B*5) 0hEHtԫ7q)9;$+E%uDv5><2ьnG]­bQGjГh'MHv"Dh]~"sh'ޣby՚$NIs"BU*z#cVS3\ "245{NJMl]TLBCOHw";rfHTͼF9<#a S9 E6J 2\1bR%ku"RT6>P\XpW(H,vHqz;Ԝ8"@EhWl!!;#+&(hh52= Hjo Fx{j:Y8bj/g6zzAsޜϮ]ӽJN:`F^7-Lpsy^|.5ȃ~dĭKJzb&H'4/֊mr_Jgg TF_V`T:GeIHW]t\z 3"1=vV8prOO9z7!#eop#yMMnһϨp%+E/j_bg;LF1vJGP>9\ P}]\[` n^᷒czz<; /\~dC UӛW߽{=L0&uuazMx/fWsw=şqu /Q☡cs SHW\X.}@{2r%ҺQz+nZ@\iF攠BH+SwaEY;5kacv~qk<)7V,z".bG>왇 S>:F48D{Gr&acGL070? 
YW-Q)n!Q2xr J 1zpqNO=sMNy]~QO(xtqPyoid Ӕq8d" I[xm˔o!`_ކkޖx?y{"!TnFCh>Dozi|Pm)LE9n*;z_ =j㞭2q<_PkhZ_$T /aTJgCa`P?nr.JNE:F5b2o`y0ZۅZOR?ьo龖>_|~><3H*{ p.; BkRwG<'Ϙ*-sӞ] 4qۄL2TJ%e0O Ѯ{znyE}݊QREk3}݊9--HB6N ؄s?g^r WLpb⤕t[̀x`d0tv,+ve+W?6gWŝK)>Y\\\?rM{6 cbr4Pct#50Nų.R$w XPFJ-t=ϪۦkͶ}`] }r&ܮB轎1H{ݓ"%R-z:33C>F8C<>R*~Jy])5uO h+&OLR0p0>ӻ Al>/Ro?*l 4fw%3,uMRg,%,rA[q OuYrqr|L§A DCϢ1yF[&,h6K΍Kyy5!rd8ai2':uJ&>7=iPRjsS . SJ$IWNsGhiA<]eCfH9b\_YfO?D^zEώ&=?<3;uq˯qͤ|g;1U m+xd('mn6W*V7uM4lh{Ҋ9Լ\v""$ˣocO *Ҡ6=Pr^m5NؗgʳL@OK^he$hr (!az:0;-g˼{{^N.-N WBESAmp|w3d` ^гwty>;]%wW;ˋ̿ CH*{v||=1Ԋ=^ͪ S4Ւ.)KLIb6 :6lnb @mnl,'ZvJ{Bٳe:>k\)GEpkrϨQ〢iq9хDD)y Rtʲ:r˂ ϒzwB&J-ܠn3T.&I:$mh44Z: NpE4Ɍ;ivҫ,L{yN2WqH#mn>2cu+XlkpWWH??B2K!uŘQ$Vz3(,o_۲J.͝hA۱\ rBҙz!X9[9..]͜/}PtUqѦ|Yfl^y6:l4oMqdzekVvoIXt,-~Bbh?b)#Y,SEWSFյd D7i; FuOBcXN!Էgy"8%[FӀ։Z#3* X㔗ڱNy}͘.8W:h\`{pIz/bn Jzͭxf-pP; hf?H?FL* 3W']EnH5qP)WPd bВyRq=g#PsAهTS;BC 3 Q3_=un9|RNQl]4ZCnU6LUGytY qn)E}|5!g9Nyz|}p.I~if"9}4}K~&_Vty&^W7y4}Y}g芉̀}j^\^]Oͧ1;-jO--;S P.D@Sg/'2ZMhsҢ6%( |m8LWRkɦhmR=Bn\$ւ87]z$gi2Ħj, oEeGӕ]$ywfg,VuhLBq 9afyL0[EjV Ih^ ΖhFMqQhbTs(L5U; h/P7i/Sq55\FJ!F)t((គӍ5({ҤzJ?"5R81gd<ƓծF]Â5645#$RKu]~vc7U\(#}Y7YVvS'}D v:- \~j&W.1[Ԫ}<ؼb̘}%<6`s+=ՕkԾfqAiEns'x1 )%z5gF5c*n:/P~x̟CY.#e9g,eg:rP3|]`psG6ZeݺgޏvG4 @jՕk]}hT`p@M0o_uJQ-^z:;8hR݊T>(Kՙ)qYZ".<MQKgP\,6 鐛P1L%  vZY-̢o)NS:+tUVǦ G=PdeْdΫJrRt{L^J9.ySKcdίeeKM[b|Z(7$k=!$_y@g</Qfw FC)Jd[ \zwת;++pnOưK9Kt)Pt>؋/ihBbSlxwL#25%79\hI(k2lw; J3=Nd$}vKWM_7;nKXTik|i$ 3d߀H2vwkUmƂܑ5jBһKg#ts{L#(êXBS`[^qTIΌTȉ3~EH[g5# m_}/ n>s&$l]ZJ]!}3J&Og&NNt-= t X@b4RKoɎv}9 79h_ڎK]-+1l>As*nqtO3ãsJg8vԇK}^v.o*ƣ5zmZsmiŪ:9ⵯ>-N ImeHb01^=?:p='ӣ S7:J;'@o MtRgpK`v2(aVB$E| `ep;ߣo7G@3?O|ë1:Uz9 4YO~_!GYiؘ+h|5B2U}qx]NGiUnm21cf]W7M?O`e_J (;A~weϪYyGZ3L;0,Zӗ_O'G*Ǖ4}jzGT^Mya(5d:cӨ+ WA4iMd قh}azA+^%TݾJ~Xdn[~RIա^㮪;}A&8pr @ݧbZ}Nńk{yR%o$Y]S-*jZijAEU;{˒s~Nb ZBR4I,}|l{~aCo4YY/`4c{WBoq k潤>]gv x4 }LlwT$ezP1>&{cGkKU~me?_ pztqH$4=J٭=Q P ZxhAbk5 "=<_N_^+%LJHt BL~oVpƋ;׫yp7:%8|E! 
!VCQh7 \Ode34fuLjbq!(i" UBtbqeÜ-kT];klkP 4VYf#:i,R7(;E- 8^f5* UFYj: r !qw>(~~[x^W`sW&Nǎ9ZA87%z-K{{^Dl|ek*E&]pҲWԞS6]|G43!N R]kfۄz2[rbƷ\5*F 'tH>7 u0hWэ!5Ÿ[^N{O1C0 #%b[ E1?PΒw5f_&Ov,3e8-.[b\܍yJE:ɹNH.l99[5WA%ՐW՗.Cդ3t93 T3dyò%4yOK8 Ҵeo )ar­rAbh+YPY21FlͥP礔WرJ G[V_nNSi8M͌l4e"c2y{;,ZNNzܸڛCܿ&]󌏌&v&t1^\\M)Gܼ{>ka}')?@9f sB(DbxԃtOOIL٬{Jݧ0*Y♳LiQ;aR~<<'&1cq*S@LkAkk/ Urv1U~GT ׍J{[uWIS1vǩ5.ȏScǩ!3r-9G L*[&;@JDq_Y`"J S߃MAᯖL˭!VBklޖ#`^s GW48`)~|] ̻]D B#Aa֡TܤTG a dt#t(Mn/KT~}^b c1K0Ԉ7K0 j%st36ŰPGGRqYUsV"(!4/=M6.9,_ 0xƂYyȃCEDX $6P _8mŬmbS"㉙_ϟӴ?O͘ymy4APG΁ co8%搓0\HcK*/m|;X!bQQВsoFu1JW1.^EVUdE^V|8x`'dQ[!de`(í$:jJiQ!5S`h%W^G'8֠[-h(drf` ʙQ8(ǖp@ EՄ;#AI@>O,b'>:lăՊs?,GRŀc8F8Y&Lp&zQķ\2싰m̈3́6j%Ac
O&\:!lMǍ B0f)zB[T@+f>݈Or&eD 24#)\qҊ)J Y$$*o L O7a(XeAs4:#vAa%X관fUx:,((7];7[Aw7oL9f66캘DE&z+>` `Ttȷ`d0Hb0BZ,ụ=1pyDM+,V &" (E%8"שMA89}|䬯7osw3+直Dqғ Sz# ˯o ]<|u}ӌD{,eFixGb ixl+DlvU}_y]6ll۰b[2Rٖ@m;J_|ދ۴)G/]I4(6hM*Q;D Q9MHZP0`L=>Q@"B`Û"hɈ}&Oe+w Vw1F SqMf'<DBv%"S"AШ]h 㙘H |-2dZqck頸݆]M)nHqk͍;δjl2wgrZl Yu|U&H["sLt<푇]e2drRŽ<8 CY7dh.qťɕOg4 >&ۤ<kt 4iϹ䙫vA0,uDL|tg)xvz{\a_ES…~mtzn*?M1.U_%ӥW;,5wa|gr?@+]?,1m6?[S:uE}HT7` fዌvط0ZuVc݁6h+_%AuMB[ ~^,:m?wSMHfT&W});x(LAbq 9pDHo4N$HC/9+% Ŏg2_~I\~]IP)XP{᱖KZ|XzbUYPUns΄4($^DwE}Vdg]"qөOqtN # |CyNjupx *-$-hjg%S+`JQa'dHU@i:(űQJÃWtOWz'-T&-q*4)סiTCNpS,)fDx`Y/F oR`2rAnoWEO{jLxlM.Ա¯a|0zw#X)WZ5f*\eַAS>qՎ6)ڝLuʲF<'_18Z'Jp0j.Uw?VsZaFǷ-A ߺb"OaཛTG&|/YcsXh^U :Rp;Zޛe@pYN,m/];΁Iڔ/JQz^D&/ïQץW~Wމ4?;jSI?y6=ؚAS78Fv٠Φmw?4g/_>(rK xчi&M>9J_nfv6TdQŧ7T{x4r?|4;ytH^>ɋ{~-z0?Yu4 )>k|%_T ϟ<5dZo o?7_הÓ48:Y.“tV{x^̏7]WB Jw6TSB-N/V,fc_ 3F+DlZQ&΍\q+̾,nYUmf>t洡6Hn5\{A*T=P ۇjZN[Zރ[j )nN;@1lSJں+!vW ̻k?}󅗢'gOoըuŹ4#eaDӧnHWc ܒQ]{ZFu{ҨbFF[8@CT [%.$uӸ+GӚi E^-BJt`0} ȥ([S::(hXMj >k L7{2kN.]NUZk9*ےmHUb'BX`B˯V9( 9lSQT2 ҩQQq T4FQ HL"Q\b.uTLf- ?dugPhJ̏D zw:Ӄ _Zsٷ){=jl̾۞$ǩt;qV'v geiDBVbM57)rqQ$U0f7Y%LSu1LʚmJ-ܼ|MWfs?^}~|2"XZ.4m_Tn֩? ohUse*5ٺ6ݥz~z~ftl0N1 68 1Vƌ+ʘ=k@KN6wz"x.%C(~,㒝?=Zm? zBJԾ$Chz*ҍ4^ %ߣD S)BXQn< dP 1v 0!"=(.|ndz/i~c/_@'Vo/_}{Agt 'haҶuz׷oU׽A`PzwL'xL:YMT FA8&$WQI;l_ /Wac@$S}Oj$IW^צ ?_iqfo60_3|2gS=,Yf0k> "L|=+3tAX3% ;qD (#VTۭ `H<@Xb%Jm. 
~ {zy_D3zX(8VW{Lg | OwĀCoG(nTPd˙F4߻?x |e/&(F߻[0'V>0hܻho_:n;9?|W`´)pU29 ]nFX z 4S*l{,P>o(״ |VUTREM2e,jMg'mlu|G)!018%ǷTbyM,1,1!Sla._zL 53[op7e?_b/Z=ڔaNFSM* j$]*N wr܄ήp?ǝ q z8ghdWlP~k7zɭ1 h.޷5Hjq!_KͿzTR}ǕG[Q_>^U`2[lsz}@0(CtBKf$.ýsZ t@RDbSmTJMBAX*3﹯+ 9Ӌ޵E/\|?s\ ;plrFyExWliF׈ni#_Ud*Od'pg].$Ga:c̙BS݉$EC &KwCܖ1\iKuYf"FB(Z*VFЭR69^O ҏލz?S4F9*41~~IzaȥwJu~V}c6?ղ+g.˨RZTY|]1IgT1V+Bze%E$\Dsd ʐNnˏr1Hw4nG\g9Dݲݺsm/Scl7 S7M29N#1Tѭ9`U89x~p]?-ɴi_xLLđ'Y"cV?xtg9PO $zlUi'ʁ\&v{U+F\m5\^X,cڢ(&,a4\j{h=83wQj2,m>7 fG L6k>X 08GXD2=ХI GOk{4 0>A;,3`Z岀,)Vyaw9ar8ȇVy٬,{dyw4eé#c=ťS1ܖE+SEZtMj|[\Hco^@BՀ[u:RTSca1 "x,Lj|,ԌR{փ3s7Gmsj=M8*W5G]McUEX*~&A+h|~ainn?s4~13n2)C-vM>ĕ~Gf6\ŝa]bf:1 Pkf~2q`Q-ER,jx۰?Zv,:PY$jRaԬYl!Gd]XVܺ|ŒŮj%?yTR#O\. RjSnAl'qEp1[G4 Dsy4Wp?/C١ N¯É֫vj|c(XZ0^ʂY SRVp(F)qShϮuʓ΢=Q,9)ck6M$vyc WCN[ ^AY8XJ ΐ_?Odvv8C++V;9򗴒b n}EkB28B7?Ǩ/sVuzQ厩==K:ÞcϪ ֮@?50kN1FRKv]s Փ0&wf ٣IY1^g2RQk*f Г $CyDݮ6z JXhQ+?WQ^iR2pn`oa%F\;aIQR%# lH0ź=QRδ0J=,AVc :)DZKTP^Ei.cpR $'EI&c0UlȢt4NE_C_|ZGTJN9W`3c53(ԁ•s8 1TKDDdG^UwɑĴr մ0)KʕRYߣh" f C4*9/8ǔ|8vZ B0DN>c`랔b$4I5\bTXm]2kRȀ XU3 \s&Sf6cТO3<=OΨrď,@JZ]XGzg3wM}?[4*0L~64M?˨/=_pTj+Z`gt03Vy.P-8OooGbQdZQގb~,S7&e[28}V9n:@ F̔0тfq`,wL e<&RS4#)fQPdb Шb 53*ޚп\yءslo ;7K"S SDְ1Zj,w$7֔%$T5u4X2#1B x|wK1Q}ccQeeQZ` T|ozyG*8O_̪.Ǯ᫿Mna4QBw{x:/ hw-.#VܘOK͝ww>6g-BӣweBH&SQqG +oG_ Ewܦc:eoP&s;&\F72 M uӍ=we<U<i JSbdgC+"1HcxJJEMen/o-ZTS}q/~JmwmwW;QKqg~^% 8T>Bc nrt9 t+zK:>|QHq qӱxo>{"w-:}J=0o|,}x@RBư\)`7?`ʔ^Kkܕg]ɶ+ Њ6qi䖉1k;}rYq3W5ьRN)Fښ|8iUzp\ nxE$5߮8rioWe_ ;?e)d˔"ks;i:=e\^+It$-{GakQNi8:|aݪVb*-%<;Ji6 ]jEX\l+{j18V~[#[h՛:+'3xlOu*o|ZMqβZ8oͲ(Jd2j,'N. 
h=MFW9*UXjr/7 Ĺ奨NFضeݎo C"Կj`/V-gC!ЉxJoV"M n'>go:dP1{ń嗴- }yAQc5`% ^J'1jDPNhÅ*Q16y<;p~6rXDPTj  Ψi#h, 0Eȩ4"bZ/BʔGf@L2)b<9[-p6T^Ee.}pgϓ4g?OE<$"CNht8fU3 '1c"R.d]FɺuYE?\}2ƨq.VAq6iQFWao5 >X>8{ͯY-,Oxb]6"w{d>IR0>IJ}$[taҀɇ9փ3s7]sTaܿTFp9fӶnCHt"$z qJ5/-I# I-(`X|KԊl޺(eW<ltHцzqhvzS#_Ne?ZtˍOe\~oWثp@48WpK[4ic$+Fq$8~ "GyQ"^`#MP^c%%U!h&.e R˨҂:kf{:>^K#fK'm xU=a8Ԯ~嬺Z8s BF+X[ya j)uHALcB.PN`F{KḡɘT XomG.Y{:\!ۣ mD`a%fLbƴF8CKlI(U`@IHe(pYzk &3qѯu0 S< 7>auDW 9!^'=<^B0ז^ pk4LXox zq)N5 V :_UR#s`yw<".օ9mvfIu%^ߒK'1zJ?Y f锔X."Ak DbvԈXT)$. gn:!Z!,;jF3S1_ WO7,ZqѮS+%eEĝ$ `J95ڶ+cH<dk\U rm ֶ|,%J̈́eXdbꏰ1#1t?f6^󟖥m!hZhp[J ( nY*xwu1 +0)e拾ܦ(!0APV"*G^0{t|hNS^y厂(@:|W#.sVkQwI ʹ> ;`_Ń_R /,V2 ̬cJRLNeg#8g.ݲJbԞw|I,c|usӌz b*d`qd7LMmMdɑcݒܺh1hz*g;(<QP$}Qja#I ˵َnr}jJ5ᠦ[p%#  ]vN{㓏k(t <`Og]ftq:v8ϞPH=qoPW&>܉:cv .]INEM>y.H4Oc3ztx9hxXKFj=Oa`8_ > >gw?isӋyӗ')Kg<˟v:n$-cg闟//ߜ_~ۧ^*Qgxpit{; Ghg=ӽ~~pt8 _:u$8~k@W9c{8?k׷q\tɷg_nWu?Y-z$+,׶'3&o^7s]$jF7?GLygU*"M:=G4:7tݗ҆OfmX,P]o7..2r s|"4VH?&~?Ӛ~f;t|]H2^~6Qȇt}c჻gmwԁ}Zv= һLg//7AQֺA10>m5;J*)|S^oyVӢ~no mNn >@oR ߥwֿʆ>13Hs?ʠ|zstx=z Xv97o~!u^e̓^ud3*{ 55-!7?Ot`<6degQτ5?G~_}3QSTd1CQ`h546J1 J;*B)0|S~NDHDQs>C4(MhECM¢ ey$fHBɼFՂUn[1X}Q-˼H}HַJ|L~i2$ZaNMh8ŸԭV:B&!EH/;"EK #)NR{mrV N"c+%Lǐ3IdmhO:N sDuZđ!ւFb~0*M;LוѶ,Ô6͑q(ޠG.p"ǔ&\L'XKfYB1s1'h _!SؠVTʠV`La&`iLO2,qZsɄӘvDjIcJDEEUx;QY]C j,!.gBUAx"E~cFp xNYbI3m]'t4&K I".0$Na*n)!jC&w0p -XC,%VU%ByJ4xLPSO)Q1$;tVƠr)D:ITp010q⭴ ª`i Z%9cb_dHma i5L躯y<6M4sC϶i"qn߁|.R_7#jO ΍t~ڿ& lkTzY47W-sS`1$sZ#/݀!Av:c\k<ЬЬp==Bdcb#۟`,r]K,yX۷'y/XTb#CKx%ԺC{M,Sm)fTVAҐ}T/#9ӬCS"G4:RH(-j~uuW2OT$r&V[EpYS!C8Z[sާƛjK_xb잀P0/)?ۭ_E!Vn^vk䤕sץ'ez%N  AKn:i}8M+~Y[I&%I Vgv״`&~MUf2?!\fU̠EoӍZibD9D?X V&z(#%ULT&  ⣫_Pb K%nfalc3 %R޹>P>JȨxϏS+E o<HF$t/8QL֎fV?]^Z9R`E42]"*nU9٬ڗkUҕ[Gu*"\3oVcG 5Ejaf)qjL 傫8!06Ҫӟ??C|.h˙dLbo=-=.ۺ#к)L(^0: pre%:Mam2SCE`dSL A޻7.B^Q;_^vD(2`n"q F8VEkX2]j"M%5;-KEȾ TrB6H,}OVXzah?P*Uwja36økM ^vUfEKPUeb3) nYry#&$ȕXs2 y|;$Esw;T˛6iFkޕrB@Bt|?^)v2urUܤDt̉A:b,Ik} EIs6U6U}pɍ 
bɩ(@KofЏӅy0:zaUys2t*wѵ}o$\dL*+.x6{PNj.ȎD>K{ٯf}vx*E1v*d\ҧ"Pw!w+V|)F4]uqT~z7CDM}B"B^8>DL1FM89"*ͱQL"g8xD]ck"na{xajczui3_};|Ca%"0⨡c^R]R} 6Qb6~.J`龞ߢH^cR Nb{tgѵ1U~=`Er5w_?^@ _TO,xx Yνl~R R5X,O05,rDӈa# bq=G9-#mbMc⹉܏+,-E IN41r-=ʳMEK -Ҫ#/Ϻ` ܀qQ9Ь J+9:Xxkx킄 $6}H:[?n/3RC 9qwgOp}JPi=ub?gO u.E Թ:yBI `Iʼn4TatZŒᙴ+S-C2Fi- rz:U{i ` nfVي:-ƷC him~uW>Wu+CȦw'W<ؚb$l"mh9XL%њ: FBԷ2 ѥTdT-NDA~Of o4i4W^3*\<7 OopY<%xdsPEU|C`awuo!0Uu]rĠ(vGCTsmHEC'_Ep{ fa4M[v$y_ՔlSe7$E&nOUWUWWW-n N{7/D_u9iu7*{AE?<| x)t]UHJB7^+ o|Q!]Tv"bi #MBu9Zr*]JetYTAFzҀt ւrAyB(Y*u8L,wJ.:!CI9o-FpfFHlStՋ%QDRQEK|ϘOg:E q ׎WHI\y/|/,Eíp)#+&iIKg{Z O~.a^BIX1{B uV)S0 PQȏ뷷_G5n^d0;\u2wQഴҢ) N\!49I@7&#b>Iڠg(ҡqS4|DR}܀PmDw4vP J.H "4TGt!, \⾖ iuT_܋"' IV0'r k (+CD ubR20 RKLC8ztTir :g<Ƿڌ̢4<̣-)HRԯsgū:{@2%ԊN!~\ڢЪu2ta /6Ϝ74 -eyRˌZ ϒB -0Ee3 {\带Ԉ2UpF8DZ&8[(ރ}yq eL(M]V` cbXFKIPNeDӭT*#qSD@?D2.xwV-/%b ~AaSk•艻DoXB?xx ʀU7xscS§#a}n# 狫3-&)Tonq~:ڐ ٧'7İQ2gla!0ּԄgFP-+I=]F>&=#= )QF 5G^*XhIOycV*u{}#*" ?*hDAOFp#qA4Gk^J2Oh-H0h^@._I h4X.$dM ǺjB۫"H͈J#.79Rvd^{Pbnwb8qeƏ!.N9N 4Ȧ@j鳩3 eǭOgzo?}yηN|fu~t}ٻg'4A3ލĤ N z8urԘnwujTs b!p3'u$8//"K"dyu)ʤqh gd3eqDA Y2x* ! 
otBdh!x'0cw/Y%r-CScz/v Zv_2)ct7ErN$$'1k=ٓT]$)$aWO_:$`İWz+Hf5AQBe E칔Yz͸d 7"W%%iTۂEݹ*:$ث=s-SM%u% ^gLQ^hѶ)t//0}iI!@o.Cq_:q$ϲ%zd62T7y}c[Aqÿ$-zznt-(|cĐSrLMP ӝ7:X:[#37( n6~'|sy?{}#XKeMN2b@n.C[w<$Hޥ pTG.d<'䏨t* 4YOjzpKW}Iy&ET'ͥ A3υTHKׅ)@剴;qXkϴDO؅p2{]ܮ7;|tKM]W k[>3nWhui6PR% Gr+4d*fV'- u/5Iiy'z3ׯg4eШA^zYըz׳7}:=\I/lvâ/;A=Ѹh*c "i4 O_%H >yvITW*iWLʼ+4DM^g&GfԴcY0I+ϫ:)!~*ɳKЄt5 Ig~r`աIť5jpc<[1+Cq҇͵,cڀ?B7 ~8|w[DqVEmFK30zkOƜĞ>|$p6(Sz |F3DJ%uI0#x%3-oFekZq\P2EHFBdĖF(#9x14HPQ)<ֽr$I`?l_H,*᫋5RnOXwXNZy$|F-.7vwU.]Tkorq.>Vv͗%Ceo5^yvy~>~䩤J@-Ѡ7Ls}MbZN>I3u@tcBx'ZˣNA ݱqH4-%]-O/__BۑFd/JO##*o&.k&\(a3+7OkiZ=C/7;@\h4K -*I.̅^H"y43$'3g}(vAIkCqa땫kȟLo?g2CA4 u"+8Ϭ`hC2mf!CА 7 CF 88n20"Vj;,%/H4T7-ͲF;5@iz1n2FzLRrI X֕F+ u=)XҥI&6II]NUcz\8 y# q(eАx#h>fbTڀ܏HtC(bCC0-45ZQB}@PG B4Z"u 7 'x^ 4F&H+Pj?>bhy[n>fIS%qo4,w*D* #5?o?\ -zc@1/5F ">%w5fU^V6gױYb,}6K,jpl؆pT.fes_γwJm$ t+- gO<#1q[y0OSǦ;\{z d(:9IvDF0:tO+8^{[y=3{>NaaophCT,N#y:I7m+ǧ?0&hzwT"qS"yc7v5 :/үPy5}b{Ms;p^n!>|V 3*X޺q[?jpmǒX2rF87G33縸A[TQl܌9Ӟ>>4Lc5#:77+ :@ax7LrMr]MZ3Fz!;L3PZ ʅ Ld@+u8L,wJ+ɂ@bD{f kc2cZONrɔ:%dc=cW: Sh$%&4?QU]ثM&+ ?7u';;eH- -Lڲ)PcJ*YNe 68+}O67fD]`ꐷ>QV *ٓOE"ՙbNG%%?|_9P&'J L,kHyd^ {w.Q{u KWvA4T6jmLzf?$5{s ~IP<Q_隈k:]I't.S.CS݀N]JX5j'qRPå'&b“䉌ku7fuJoJY䙜|êKhKۨF+Uk6(ΪC>5#Ň1N6cݟ'']h^i{䡫G?{S=2qdQf?bڐlCb`פx&6l]k(S NL"nnY `x"&m H*0&qj箰:9ݓ3& Le ѐ5Ӂkp7"NsKj|P,z5Sbar9/w~eLuhGCoJ܍\]FĕP ul_(jc1UafĝOO5pBa6TarCjmLbn5$`=+ZRݵG'(Bx85{l;x[*"P;gyi/d€6 ս^^YFw_eIYpMP-npNe>@.o?\|*,rPy\=u_!P]1WTW׳$ޫH/ہvJm0*KR}vlHkMFz4+U-c參GW "DRWʧ}LE)&DUջ IWvej I|!ӗ6Rϗwo?dx 3ͨςcyҰ=#j_*Y#WՔUP^(]ŀM[FXJ+V֠XN<3$b1F[*  eǻje:sH>~{/뾡cg2AVZ- d&0@,˺ʃO" xM9Z35*4ކ`iܺ'KE )]&Atx@@*5v* #u^ @eG.nO/).ll@߷"PJ&[P $)Sx$]ԄV}0sD[ OKvkkD*ݰagyD;Ь;@pUv5md~Ů@rrd&'VdPHv}VzxӲU7 ϚA~cUB‚~)~J* k 1J-c~4 ? 9bDA{I0ahfDTAZ~|1’jYuiwJZ(41L0 T]~(*qkڤ2ܓHk_?8g(#2q_9>~N6p1_dv!n|`DU_9 !?Vc ($%3!ȼo69A %IgMhG7 BsT?6r$d%`O)1Md_o]"49:㌅,}ɽ\YiVyi-K9[h%ޓ88f6(mk7J?Ld T/wSTdA\ȧGunCo-QiXc!Dv$5%L!RUmdNXZzu7ٔ!Z7 ={o~FHޗbcqf[nJ'+:O? *qEǘY 3J:PC3Y_15FzX'6+Zhs,%%{ ^]>W7fy: TKJe<"L奿3՗߳bcWaI,IS˧ a}Ilf2):o+[ 㨒|% <\ov~( k ~޿;19c2rm& m! 
"r|)eGxJՇLi-r0=N'([8pHd#/$k+Eƨv0[as*ȏ&]nqB&hXcB]|1z 0ayʰ[ MwXEZ=?O"+{7 CS6xDXs)=(Ԛ*#9H@Zuy6E@ĵBcg= Ri P XM%~Z.ђd1τds*r)c2TtQF)KeAT7Q8iw+>l̳c i3FCRr5UX".&WЕP!2SGp.;x_ji:FD?ZZ5]賯G\|L+#CI/h]ܠ|Om!ɩ a5ɍ DM=i^h/PikU\gPΣB|p"HwPO:Y~q/pGLœH0ghmGmARwf'vRV\J8j~<3nN5 a:!h#D pNj^&/k GK)eP!huvV;TW\"ݷįܴ??; L75{ߜ ِ{!ZŏegXs}yzf~V|=EWv+C鉻na/%~_%E܀|-}O~{m/kɟQ%䗟_θ/XzNS-L^.T5sq?g3|$6dgVrr~½# <{k)&BS9I^Zש9X:M]y-(O hZ`v$Ѽ([v:lA"HUJ޼@\E{—VdOQUwpȁ.قLZ=AP;BH0T{̈-L"Ϭ+DJ p`WIp,6ٝxδ90eM|?7 k =bփF}Y@R|t[Fxz ʀ ՝6 YBx\zx`w_S$Wc5n̔r=^HՋ5߸{S"-t:R×)q׼/Gธ2uo_ta=Jr Jڇ.edɏ.y0%z`BuT~_Sf=h2[GϽ*Jºkso\x6l~Mƴ0dO7ʳ|+DpΆV7oxx}w9[w%0)m>cL= O\& <_A7fZ)i L@BүD#RpL{<%Lp!<4G@z2:lYmzڬjDAו *,‘U|$j Q t I!Ӿl h$7*Qqk{ \19x B9*ϕkP% Kg g9i;oJy^ cD?ōmr4lZ.gu?TKd&'-S! (Uq\q(8˵`II+c+QQ+JpI9* 8iFѮ.,5D*[-Bٿc^8oJĠhPsH)wUK؄&lsQ9[(9!ңaT90#EC3I$gZI+BUJLPbp3QhZ“B+FQ_3\\kֶHN$v(B&l1L%rIVWFs@%i SdH=Z΃0 9n6<Yod{m wmmy9'r(i46Ab-]حc{%'i?C.ْ/dZ~3 C1E6W1OG]Kx3C(wf׻7QFlqL{&Ӱ*t0'9 ިlP-oJD7Tu<P5)8ryMÒ6A($*8BoیS8^dgQ]N-.hY M+$H&A,ȑθzT6=9 cQQQT H:UR( ,`5xNv/Co"%Ujipeڅ%Le`ڼJbmVΫ?ǼjA9 =lONJ%H(V9 86q-Dž(2K޻;O2vDuRLLGk2tA`#+ՙG]Z6V¦H_nQy$JLC}ǘWZwt>>`ߤN6A7]]HjgC\r`3;}̴FAY\|݊EۖY4$9 ç40L wrs&~y&%%\edN`bLMOL6@t`W= l߁drd!G  02ۦbsU󄰃,G v2oКNT;e sl1Rq2w04!Yv_)/fd]w+ VoyyEew/okL̘^hwgx+frG;<,ZN‚}?k7J)6q!m%Q_L6~k e$DJ7GS IOKrZSa.S\\^iiUSqV&0EzCm. 
o߳3M3ޱSr[ WjcXn?Cp~$7IA1S;Ȁځ`nF`P R|y!ZlX1ݯ9ڦG^q/g 21Vb lˡ/mQFJV,JQlS\W]@GhgW.9*bp֬Bj/lV^>HmIJ?A/ di6p]>ն%~W.(Nִ"S.B"6AՍHn:~H%߆ q1ږ˲ؖ򔤍ؒ2sy`3JڒT1ʷcHJ!Ժ*%\J \bQ YO܀ב*{ĕ$5.u=#8.bk- "(W `W<=.[8V*T [ xy+2YT1>`Z D )+88Zvbi, уTBlwf0~LF\056:ht^XPsA&ۑT8.uo)+h2uDԣ<~Vvt$k̕I%(w}=Mc̶wlFd*Ј;>p$}FK#-J92?Y/ r*N ֈ"=2czf,J*8,jt\8va+hbbLW^\FeL̥}s7ӬtNFUl*##XǦ}}2ceHA@lUr҃پuM=۳;V'GSe a3r3ڡווc@#v7D-]cǿ>yݙ`Z4lKluۑ: 4;piּ040:XUwLz u*zI-xb{]+Ó9r#׉_ۨUfv$4E?_]17;n5":kbݦSThk&W|f0%|QwYCM=co\ v|t+êF7"\{5v8n `~z>_߀(7݋V`ow %I.['vNswN&f8xi3P @bQjc&z #Iw0}ë.8H']__S^yy|P`j?!e+`l`2h%AϡzE7\۞'X`-JQ` ,v-6Z#d C{/1gF&x8u*фLg 3 9!(|j #}%F`&{wH&:5;ͧ4czEnz>4%ճR+qxbϮXm ιo55q֥q{)6{\:4'iP\+~]vƇuWO?ۏ|Q;=g'~|9?>,:>ywzTF>^qsCAۍ8z <GU)цqokn߷)ͥ; >T0f_u~kйb]6n1U)_/rY*saar*Auq\ q}kt:pճV!mW>0A p/^sJvZt {jA3\?\qiKN(vUضJ)Ctx7SBs>&e0w|C+m.k/xkbf4DRZ\zxǡJ\F?txloE'e= [m1A.v"WZ`e ڑtL\X /d&@-;gthg 0c%АDdՉ:SnB̫3߭֙H2LX__:k@>Lxޤܙ`?m[WAsg61,=hQ}D#2GdԞ0+̓0IƱ`',7p1RTY!dQy I}٨"9rxG*]^Ă~ (G] C|U+o k!~&֥ WءZLZwy1F_C מ-g9beƒmI%onl9%4ȗ yu<iUpj> iJDG0S<ͲH(^<М./SHwoDW]? u .veBRFJV)fSkzR2evOQ)zgUgJ-v_ai>raG-ʱOO#VpRߵ}nq?>9aCD`ϘyLQRW\F]kդi.d.تwg.s~YDFghIyɫ Q8Q+t#"gG7KHAGW%$s'Pۤ\o%sg,;-gg@Ijö9FIAo @N.>]nʼnmBm9{KG1EʔM@v at O,=Q(3[0 Cbi=R6SnU;D5zGPzWZ]Bd >u-AS{$!`8]>իحOm,ZCJdBhu¨ 9CIC/2]Fp4DTYKi+*ʦnS  &丛,Sm (ki9`\Ek" u"ft 7ӗ _^_HU%vr}P?G7GQx2AKfd_ʸG>.‰o;:ܹ)vsPAѵs%z]Iͥ}yҥ-RYX+;%!LȢDB[NJkB}$0P,{W&#]9+ IY 5#?-k0ݏIGb&38S$oJw?uϷ9L ^,] ^LlۘUރ7)/ZǼ\ Br^0L"l ("daEkAGBV[h %1J8$%vSWXnc45 }Ivfgr,/CLmMf˅D߷A! 
s!iW*Cн)TWW9/Z9 XI{s?6S b`kgӧƟ|_Z'J̡QshE!c͔=y*Hu  zdpąl*b{1"0n*"Y];XKT+0;:{jFtj)%Mםyw^LHT{쿳v3LtZąčl.v*1T6a:r3-fvw;+ U4/Q&O35Cۇҡ GtsOۛSο:H퇡KwZ= +gth6i嗖R>qr U_{ᄎ^__77;eL bȿL!OC]ȬT+La/RH RABX0IaE`M)"wY梍,|˘.r5r߶XZ)f ty<pnDu*FUMª=F7&>hf=T0JQ Om}j9T{((L ̖CP5yo_"o{~`ZV=ۺ\%D[ZP!{?0O1t-=:kK?y`tdGVƒI  Ͱgؠ2c/LiuyǨnaC TS4Fm?;o6 ׵N6CC~ۖ]ڗV7g{F1y .iT쟿?`<yUwE^쫳?xz~c2e2m8#(xo| 덓1EpS½u20B#EkdĝQ3yR>79@xz}!FFg?{{WsֽX Tn/,'lv!/}$x-UYdOaL ax✔aOF )Aք5VYox2u?%>\8]Ws5qZٻGdZۻr?`Wc7Y|fφB^`td<*HS$Queox@F)]W_5;ԉ%C]xJ %SJ-Nħ s 6ɘtXzRJ]'梆n^*.(/$Up*J#l,3{$n6|cSpxAY JTΐq`Ϻ$^(VO@9bH0u^~ %9)rkkV]{p9B]m t>V9\\*'0h}eGB6KDϢeSfH ^8abH+W~ Ao=P{%լ/͏y0O2q@Ȏ4"V\]QY )B*B8 F+p:kQi8yC'u$Noo)Z/*LC3]S"r~Jx6+qS~'}3WlP'}eEkXnOYak[D!T@|u.+tigqFy+A:2C#u4Ag mwnN)g3'5=cAEc@ gOqn3H ǟ;$?(T uh3>1go189-7#-χr]}nbY"H7~QkJ=$\KI%#t'\uɌLJTֿNT7HI++,lC5^Iʔ̹U)!3f{you:`t>M!L;/SDA/4&4X5 ^CqCQ޿SNp@\#~!h.z)ݨ xZOGO NNW~CXshffM%4٦x"QRSEɋ0l}(a 45D*4#n{7ۏS_ΉS8_ëS"QV1wWbOW|VikBhƐqnarr@Ll!SMtb;f&$KGΫVhDo,;qjʗ[:R3%uO M̆ouםg %VϚj'NevO̽;y2VZ_kb/橏IC m, kR(vJoʗI"+MbqkŔ# dwK&6h);+zRk$¥'ǔ8Ƹ"EδfZ!N!xAjdajiQ 8&xH]_\,h -sBR% 0bav4zig!䠥1sfՆ:sr˴OyoBGn^aaUCkFfOc.s9YN4LPF{I7w8b6L=W7Hh%a`ܿ\5\*Rd> =xM'4wXR1:771I^R?š&A w~>0N`y;^PQ` ,VD5gS~gU(LRp]ijzMz: 3 US+ҤʤJF ܄. 
syoad*I4ns]Nnj5c*|$5X]h F !-ŵ5^G#Za,qJ̀Dm&)W")QDBY|'[P̐Vb@rg3iUaV l}|؍6Vd^&4kiCiLw K"?Haz,9v=9[`WkHN!EY;Cf :Ɍ|xNRh]2)xA`(^1^3`QDQJ븠1r)iRm7'vCENKU!:fT)?9%Rp%B8O1qsB`(J)TJ8Yn|h c9ny C u$@΅D*J7H08VZTKR +H[Pj!ek#H4?F74S:r3coZ-/Xd7-wnAˍ)~APFw [J 6;LCVؽ< (7Cq#T*$EMRCA9=&R_PCD_7`U<(Qɨʣ1{R:@BS4L* $c)Fb)F،5o-46VL(8v[%!sgxp1d\ SA o&^}_~EQdžb;R[mKv#dj^\0PڒڎW [J@08vo2MCB3m =⹲{}_Ym=`F2wg@nV,t7NVm t{[1im9ȻGx}'ic˪C嫊m&xq7/i6yF,Q|_y&hmKBa)vxfj{w^.k>_Ƕn\tM *q|^VwWe"pwOCv߁N:O;7yj%!>^8b>8&8C ql~?UB Gbxxt]ebbBZ kB[{“B S̹Ujϟ.AN PlDL1FpXzQ^+r.4ڸj2LGCuk!șcYRhen׹n W|d|otSX(Keb\U#IeqJ<5מ’2XG C,&O zނQ1v - -:aϦpm=r|eQZ"Vd%)ZC֘N.r +Q^2fdEYaܐ$##\RY'sݍߝ-~?ON.<ߟӳDb冚\aˇ?a ENuZ9>Q@[z]P1:_mF3r3Cyb;]9Ӛ8Cq^UW̡A\ȇs+G/@zr>Ny)i^_Q-#{E8hmڏF7t!>M,q.F+ -5,%3g7AvwBZ(sH/l< k`P+(G L-"\**K!%5h Q 4۝6c,@ ׂ-~m)Ch&idЩfd>hjm3L-\H{=leJΟncEuHϞF*p47Ci!oyPtc3g`7k|\&Q3F'Ut'͓_&Sóm޺_ޟ{tw.WKI컧w>xA+$d-E<ۍ4̎}m$珍["C]o`|Z.0ƏzRj:M_~_?r{} u7w_K-¿鱇`˖[8[NZR/2%cQC}D`Ov48_3T:μ ^yM!Y]װk\x΂GU|9"[Tx f;NSsD`Hxx2E4OT"BHIOT*1ݮuIk)V'm֙η䐓;/HFo^Y)Q}~5n97':~Qjֿy̹"+iA }#=~S*Y[fڷ[Pk EJ;e}h7W}? 
N[JiQɢOaW~)lA{#2P 'ugeV:ahOru5įEhƹ fR쾿 i%#Ty<ֆ3e7.kET,~Y+A oIoکMZmPEJÝZXtzFS Ivۿ8x\eˑKܓ͎s#co5k\]15铖xcP׷_f]NZ,7]ɥ# (rY N2:DzjaW7lKOy*|lQmYPfENc'-ј[Y@oyt!7L5al-v^vVm5__ˈ(-.8NLZɵ*Ňh#1xR%]BZth 5zیl~eލAI2}Vi-ch>NsGpi@MI'nk;6.ɖ>4@faZ"| g .Ah/[Z&{ٔiRGyFIn LCalUE1]ie_e[wp?@SsEg GM-{/4"Z=Pui4]H4 4WlHbꪢE:Q T5$FQR; *V3vZr>ڣF29W9k AA@8䫺kSq|- ;$8v+m;r)DJUlj̓uF'A!k6e7ճrxoDߑ2j:`l0ߖ69BxS;qa`]Ptͽ 0eh8*Ķ܋(ǓaB]jZ!Av xcq}+oVlm9eFn[+gƢ M[`*^/)d uS*:2:Fal6:M٣׆3yq%i5p(&*L¬8L5}FOSUy]+yfLV=k1G I>V gCɿI7K]ӗu&g.ztqj ׶QuT3-~Lcb3o5ft^:O*:)hRKM "c5(Y r̹kjp,Ɯ;`?7S7[ftQ7wskݜN!wWLd{kj$M/,94n泠 v 5VXfI$\3ZnMdgR2KY4W\tA %g-Α*C)B8Op 6 \L=hriTH NTGhm@<7O*jhfU/l5Emw!K!z-V+M1%񩈕Rei8QT1UXѪ4faD-H^JTEޣHb T$Oʂ&V2dXjEBÐas?p~$4@"1fw0 z)= Z;lEѵ-PbHatƸfO$(P4f?A4Sd=n֒[U OY2ϾOԛWx?Ƨ;ݶW0@1Yg> ?ȧuj晧@BJrfACt1:&+ ?1TAj)*%ӑQy2kCDtĖO Aʃ;Amc m TV6/IbU/Iф_Z~Yy?Mb??>0שć??2X{xwq_/#ٻFn,WYΖc0 4|E@ɶflIdw'd,UI,CҴNJ[.%crCA;0:/f|5?=/˾nSE82o[N8QcV5l.1gIn`.y4⃕- 2}5郮-+ '/HpJYYB4&cH2myrcpB뜷hZN)(DGv>'lDO=?Q$ҏ8Q=>Β8ԔꂸKZ\Mbo">Ց*%1vImu[%1T6?C: Ml0c2>W&ya7a<1ek VeO~.̝f}+[ubW ~Pإ{b*y_]FH/W:y }૧\:ǝ雫[\-:צCbE6VGny>ſʃ'7?:_:44<d$*9{AhsZwH+)4k)t'ù˙c,/_v:_Z0"wUF7[-[1!_5Tnޓm.'߸S9_5Tlg=kAԓMa~%[-0s1͔4S) O,@@4$* / s='FȉaB˾ dF&Z:ewKfKșHRc{6.:. B!rlU5o51S%SfD0sxqs*.;Zs?2~e5i6;ؓn4.h=Ira*]F +Ezy%ȣ3Yci\)XW!/Nh4: #չJ{ BH^b2¯3't]ۗaerwRGwDUW-㉽ LE٥u}o@Ⴧci2R~oG Eϧws~B:_\ xW'xNjpQ,QG6o+#_=mk%e5ŤVv],:mw1@k,&{sAH~ug sq,0!v:Y9/D|<nӵeٛO"1!JB6,SWenT Ϥ\ѪI)),|3 x6𩒮zgm' țiȓ$1s LX^^9V>V=RK8.3Kr; qaz!Dp%v^S w&B/6n^9⿽y6 ,5:㔛Q2sI#=aI-lLzrLVY %>(%`: 8@ǰ !FI.{v2 8ZהB<`VSe{2A<ڝr<33N`fb_ѵY=saɐFnUWNjRU`y "aXA*&F沄bw&2(Ԝ5(%=wN6 E*iaURF:WIxpHsXFcx1Y꙱3G) xw\>Hpt;35'2 ! 
!87))Fe P qXȥ`,t 4,Sl!$`=Ufn.k`+(|G[zZpT# f$H u0{IsWŨj Dyϙw+e[$j*Ic|97dJalS\U:/V s0NWtŅ+)a60 qH# %ZVd_KrX-9A#9I<:#&m6G!|Ma)޶ & IT.-"2Fi,LY.7eXxNF aZ2- V[,A`-"9 Dy̼R.29I߆cQ$-sG'boj{& C!;&1YpVg]2Bh(4h\ӒЋfD-%VKR[8ZԂ@1vZouKY5@`Ky3^+D/UѲKBwXRȲ)+޲).)\iXŻTEK*bKq:H 9Pq+NsĴh^A@d{4LC1w%|f9‹Wi*C8 ]p~pzkz~;C7GAe I{xR\eqyuR:Ȝ;k褐siDK" ^W^t=p.` tkvy-Nc!&J t^~pN{a*NP9X[̹ě*#}pҾPuka<1e ]'|݄C@+LfLy""7aW"gϏ( | |2{ww-Cw1zݙh#m~_ͻ< vY`U^8L<&phsi!Ն E 1eH)Ksup<.yAɁZRHуj ]B㴶C/@kPN9 E1S4Byr{)prƽ2A7D ɮfE_]d܋1k%1ϗmlK4fcJ"$ qb\_1čך ).QX*&kJ"x5RUdֿo4$AZP#eWO;BI>yɌ̗i0Z-;1g4f8Qfl0? Q_TI /?lfQ b[%-1k`7bBZDrOO~S3;lj_B`mRQ3r,B3> qgzsҏyebp$d{s?)Jt\Y@pEP)<|oRQ=Q&1woeo/ LOgF֛´ h"$Ǥ[!MAcl#cm_ibw:5XXIPP㘞/L#Nj0U)Y֎o-$WPSO{S(zK|7\AkK n<kM>px)a=oOǏ̗?}'^ y3fT&2=;up1H:}#0uHԌCJmERP @2?nUKfAtžG M!d'2Yн!c^+"ǵdo_kDPSq &I!J8֗nf\r*7mxӛtqec!ZMgQ#»oۆ Hw2h+{J"UDvHUB=pUc߉ ;j^fմ;DG\4I#exkzDt6[u e.78, 4b"Y5z=%s#QJU#=s g?eR_RDuAD nWT@MDP_:(rIMq" $ZSt m&6c:\ BK kVu<.[Qp?N xc \? fe C|^tY S̑)|SaBFy>obtDZX\@o,<6CZ&z,0U9<+ѩJC7rf5tNaŘzi}d6/ \ZK_?H&TD«&T'@Ks;px95)REh"Ņ X"bM/x(E*{)Zg$8yWV0〞^bm ZQ6(zښ+iTTioSIJU#W/)q/U;xLZx#Q6٧?JW$^hr\Ƹ +$۬1hrڍEf9+gn7g̈h{/9V"1ɒ1ux39|r>\ܠXb n2F?Tš>%,u}3 \bpl!ە/N1QXۢr3miVD,Y13-YIbRRqs6eqD]JGM2Bh# Y؇!RmM#C>ň`.1kUۆȼ(Qepe,btzz͊:][Iʆm %Xl^ܘE«74ȌdD9_ِn6g; gs)" Qn&\!CFI&˷CZqo _b7rJvtPzy6E -u9V/v?QZMD9Kf[٠'߃}si]nxdʖ=+~]ݫU=ɻ˻ta }hAbN: Ê'zĨZeӺf;KJx@bp 8+$¨H 9lI w_{2$)ިC^4۴Lb@MԀÃU| .C>c'}:p ҤGzč݃`B}gѐ:! h5eɵKkCp?y[kx; 4WݵʝR8x1ç6J+;> g{k%6+\mP+b?1Wg˜LH~œ@jL1x!`t*r4DURcي1K&cVRONz6@Q"\Thd5թ٤OŠglIJ(6cd|WvVF/%Εư}v?P0 -Hlmva NƙpË9BtND=~^͆V=r6VrOKXY(sPti#M몞1Xڍ,>vA Z X(w:2K{yD则~m_uYY,$JI FTeG+ S\{ IZ+ !#Rt, OM qyanCå\.>7.3'wG{=kEP%`shYw@}h꾼߂-e'n3ȻKϿ<|.Io7V.-n8GO< " Do*N?~l~/./'S?'}SΥu~_S$pm>' ?0QD0;=d>Q.xw)>FA%54qk@&k¿dQ0gtjc%Î3h(:%$dmVݓQf8x)_e]IW@n2jAJ:YNɝ \r񶲉 J;LBzc&$Bz֣/..'%^!CN+Mҥ6I\m; >=yrm?[!^-/0\M}D[jV,x<N`,ѐY2O᧔HPٰp0ނ^Hэ td¡uG׳mJ{}!ȑV6/i"`H=)sڕ/N1QX۔D11"(FB 1 p3\Df>Xk(1@{9iBd0T ¢xEtE)RtC:EDޢsASlC&H  n\,޴O⚁o`n ]! 
rRk^%J FuQ Y >DcAN3*\JL[02)A96̚gK%DQa~:bĜ``r)wfn׹ 8emԯcx6Tzȹ4L9SuYp-J˔Y d f  b+ ?a㚛RE8*3wJj.w.w)I~(W2'#1k8aPqH0ȑ3MԚ+){xS$-CZOmk?W;O.?΅6=x".`ѣ3XчcS(܏| '#=,E<阢52FMP*GH):E7wrn7Jw qPNI.y)F"^ tp+2h3AY"8}v%S*fj#1%8E Zq&#l&]Et_hvN΢[9oc1=܉Hׯn.mrntOdLޕ AzU:@t?Og^C߬ᄑ(f2]/͟;2Y\@QQ*[ěUE2釟G>$g]>UU 逮Vå%+0VmsDϕ8xYѼpV4b HCWGky}tyG2>}Ǧ yJ1d/3de%6=tiT#4Y-MAsп3ۻɺ:t~yL9H`eػ7Z>mSÊ])hWm^dC3~hWzg95~ꀻcGtmEP!&)Y(gGY#zrU)}b^5\JqD݅z" D#"}|=Dz,L㡿 _xNC"_h¾".JG@@Ww#Q>*b0`p 0) (f[(?Uz<_d CRW_lOhD`Jq_¹ȬU$Ah(S98zC%05U7.{V+.pg/ L,Rs!R*hs0k!x; yW@Xer#KS, F[aK D͵^DOkWͳ'[ޫ r.Pm#sjpoPG=[ߧ&Рu3C"'n6D>T۶C!(vY  ge.Ӭ._[&Z(C3쬂ۅ)BY2 TJ#pm6w1x۟DC6 6wd܍ֆ*N6sRpf,k[x;LM]O}?7oZ X hr\FT_kX}>s+Q Gha¦(*6< F=f%CV(nV䪎[!Չ%] B[+%(iŶ%SJllT":|rcôeՃ >W \ gRJ|!3AYg65U猐6£cO{!1X${e/Gf$PET Xf`jonnˌ)3brhQڭHG-a6ϝk&Q?OwqU?qE:o[-Esc߲RU_ҋsԘy}w:%ON*S|aaZX'"aGjIJut?i*.R"CM-NZlfVuO{P9Кi+3"o.mKZ?셟ֿw'ip" #gd:n 1hc&%BGJM p:oR^۹1jII:7+v鐩m@` Nq3r@E9PrVNٛ@3,&gS^2 bIBHţpeA@A-G8tX^Ő BwzZu9Bؽ̑"(\!riŨ}Vi#\pw| #%TfgުFA%ᢖ +q2BegEѿCHbzdZe| /\044y=lF(κO]W5*(Z: ̫ܘ\'BMc}*f^M\= 5Y9$Sr.};ZKŃ(f2rI[h_G\$,Sdupq 1X~V2JF9[(g+ܷgi6Ip\f n1l%'W3oe}ww`ydC˗`2ͻUdGxivUK FH5kr'%Y/A_VxvVUK FNC2~\ t@W-p'Խۤ,yc{>YvZGHqXQ))a{c{=Hh``<-M/׾jOgv__痋ѲGsARmV;DD#0JDHB2DaeOD->dm\ QzMCX|4}a[D'ïdF:MtT#P"4ldQR>;%T,`SBr8CѐDF*0r :A;S $2|=}FxO eSc#h%PEhb'> dI pV`Q3╁ TD@c* eA/ Bi/1p42g4p ~\繖c(bH98H$5ĔE:p'rXfJP49X=N~_R f:?s[0w*$'Q_KJO%$N(G]K-|h`' ;0 tnK|ptD[p (=RD6D4F;+qyDxpE`*&WxFZ KkZ8T'&iP0vLH!EӉxVcawt6=O.U"[uz.j%j jh00$DXR-:~HJG$adp\0_ \9DT2Ѽ\`Cſnr OhSd;k=̾lS41K9lQ/'mX 7)zIf" 4X!iB{N(~?//[)3—=1o//ӻ~r_yC͈!R/ xa3+] *?>s|4?MnFEϽWu6b;Q7v5Bvnm=wJC",-`m5[ ͨ4 Їd| mU't*kĿ6\*C`O^.垐<)Ȳw}Laɵ!U>my2oϩD : )@ YϦ:Xy*XT|&ZOpn#22vb?4}|y\ }8f.m|\V߀Ǟw+=G4~l{Legr'hlÿF;niR?ŋ`*K׽~E¾ z3|?lwW=C`?lCON7=]Xݍ]-^@rYщS,BF}>L[ ?cK_V~S]oGW}H~{ݍlS_cVr;!% CgCXXdkWU]nyFTrd<[ߖ7yS*wlw695 1mxu' mr{oG q%A! 
#4#s9> wrH"=W$^XO+&/ƭ`=!yƓo6w3F{ń+ i+u4d?pbt52jyDWPKYUl^ iJ$52 Fp99͸ⒺRl*3!%F˰ݩؖFRk~ {+$X!S( eL ėV)V5dcs׎vAkא]x1ETwa`Y:5'Ͽ=} Ƃ\J(!U㞇**$au#*^1wQ0%Zamn0߾)HHHGqēA!HD (dT!9P`gjH+Um1x1$ hG4)J,C!JÞI7-n6v J!47`ꕳK6+ /e2Q[zP6hz Hr$XBkzˮQYajzUqC 溡$(ebj`˺C]`/eI1fwſp,g3ګ,Wln>+JK#,!֢0&T*=q`#J4eARI!W~ZOa9V͊hx %?j?N~77|k ~/\bwu07޽nzüXDMuѿEN$X_"Aڱ5>Yj(Ώ VȢ{PF!iu:\*/I/WwLr;|Zšɧ<ɿߚggq~\󓸉S۰2 7FWM]Vc?5Wˇ7 7wa%fM0ҸpQBl -VR7y K tpÀU(ntRsc+K^T 쎦`Tw-Z@@gyך91ܛg!1fBJ;!u9עԾw{<zN:xF" lFUdԲxz̈'U/ # >NUCbb`)MOEqOhjy2p(8v"PJ k[Bܨ 9r}yyR!ԑܬfC[ܚůNK)Kw g+8P<1xm˘rS~r6|`+tYs]{\6QDcĘcs.=Upe]kbc0nxǘ!W\LM\͉cDZyI87Վe_sp>?:9"Afa i7c`o L a8 2 j\OFWU`ZpPMe[E-sځ]u4"cӻfہ=4 -%!̳TYg,C,TJ]"/c.z_vbې1ټrކOZw=kNߦ ܸ0LQ ID%.t뚷2E[L77WW8 ₥{Sjx`\cw2'%cV:bI.VsPM-F9HqS^8~ޡ`,053.zsiN:HY~n`{3`uQc^1礡z%[ Ah{|ǛeB`MŘ͖Ql\m0*1?°v44(Ēa6LM'@RXĜp̖+pvhaFLfd/dfGt/t#%xXC)i–XXwL!f1aϮu:fX E9W$1jȵ:qt˅5=FP)%>eݨeJs ٴۨlϵ(ŒJu܍:hBhM8YҕD"`1189<p24j*(GJ |n%8[4h$]ejAp9M_aE6>6zEFƼ=77vp6#Ģӈ̦_ LWgwxYU$5?;T,K"M-N(VfAeꥶ:ed]DYR\/߻YU0lϖ;~~XNgG 7ێ[DDGA;V > OmH.k Ke}TN-Y.ş ˇy,W1rsNm5ZW͎.}]>r]&/ќ5j}#EZf;WZ׼\.,w\uZ.&M]|f)rns<՜cEޮSjz VMjv2%tgD$k̍v_k(Yky{p"8viN <0~SXǓO;4'O5YoY[IƬV;'x#N> BрQ@h㞱KSH4Y5\e} v R$R̸T&"4'S^^Dd2A*X0(pTt>5%%KU=AkJgg0U=%]H[hxcq3jiw#@.0믒mXָkXԿl, 2z|寓P>TzH c$qc{,io I;>Cx7lP !<ǘ9pPV;=,۞ ~zWYiiZĎ|V*MG9zGʇ26=4tQ TxL,\|{~pf=x`FdԨrR Š7A)NW7:WLT~MgBe{ʹqZ)6I>%D 4_ʭ1hR[D6t&:7@*!orcl{J%e&7)LuN% ?2~<;8&jʈ>;''epR'xڛ!~d]lrO'Vm'jna6p^e3d 5e1}#"} JalhIVN7.)-w--)eP94Rx.z PL\UO<~tt&(g/u)g{xLWp:'4`&]39XYv,Ndj,jNIro:`B̤=&`.j,sɻѸCB(Z0D &)dMpkw1c$?ܘ/:V^3W;l#hzJ4&~Ӈw7Y|4 ʷUM/ z\! 3RX? ɒ zR^#!yqdz"/sσ Nj̇JlzvEGqVLǔcs_T,itFO~4$wwu/E cV+0xBDoZ@׸'_͝4P{mӈWC4c~ZEvESǰ~hz#ؾ_q-Gs7о~nj=4NBjn\rsLpR+w*#K$]bz҉ SԐTjqe?k%pM36g>γܚ( pt0^My@sŏY-| N8n:-3  X*beUF//&I.%K2fpXBNHv+'-2F@N;a]*iƑ ." Izv? UNcF[Za-L齄Z q:nm^\~#xo:Q Ơq DR%bA 1- JEXS $YA6e]A,4ZɵP{.*zS=o;Ot0)?=m[0  2X y~Ί3jr&Z\وSdd9'\S (|[V\ iuNΙE}Yn?@hxrFώG#hC5 F0͝m8 B G(7v~YFjh/:U_X_( )bі{,^{M _AȕhaػִiX;Bnl9j!)s ۷iQSĬ՝Oǡ-"b'7!VOU ~Ӂ '5T2uױR6`N! 
ߑWh!đüˁa-5e*E,9$6xa˾:^\KKBUI-mky^KS~4K.)tɔ쀫wAGs I9f4"|M`{j;EFB cYRED;b z/]&֭AۈkZ=5)¼uyrK5o?2O#Z {uUk߾yp`}ncj_>e++ ڶv&5?eW{o}X/.3\{@??wu?uY 'b`S_ߖHkx<~y)q~Hp8j%Qzr찤\ZFvLK26qVR&93,%qnΣybM<@/߼^>Inz;#;G jgOŦ-^޹>_pQCtx{Q2sN{*tXT3B?w'>od|u_:o?ut fn-f..k FѪ:Y@[-7h7V##~^k))}8>҇SpJ*}(,H XQζt*ʯ1Q]qS.sQ]uVBS+y,)JSDK0!0")C1g1Γ SkLĊ#>]iDbt/q$bEf'!:P2cSg[~Q-42ijAb_m쏌<¾z«j6K^rd*>B oÙVbIc7v@CRwՒm-jX$] j v_]P 模jeMͅ T-Q5 RpYY Өdm%oNu%WHDe Vz.LjuYGԊ@&_ !A\q;mǿ) HVYȈ\jT|3"Xڥt5~|"br_D)$Jwp^ 2GbϜC3rgf:9BFrAur9]ܮr./:Yr+,1  cBʊzׇ(Nc$s6ݵʂ~FO~xX ,HpNbM46iB ikR&,V#ʹ Ŝ{=E`Vh`lY1WU\:bFcÐ,1oNQGI<& )D6#;9?`U锐&UJR#N' ־6*dLUτ3 U0>F0NifYN@eJ<(SVpiGRuJbEhWUhL]`QndCC_?r#NYF޷YJH_/1y=Yvx%pJA+0RrJ[++ zZU?_ Y *NGaPqKӾt([A 93j `o)k7LPa*m_P@HRMsǎ6{< rZsvy~gj籥=Mb&fqrg ~2gY:>F}V/Ʒ},?Fq2$)d>/_{=y=d=HwWCCԹ~sŇ^ _?=\*MN-&"GΓU u9R-&%-1Ek_AWкf= W]`>z0M&Hb^W?csw3㕔VsXKƚcTCMdh=;BmM kz1s ҬdhĄQBD1MRsg%ڹ/a"cTGAo8J#VтqV# g5A;t> @SXvfO_96ٮd:o-=BκlLט,=%l8JljA`#mtg Ν7Fgtaֻy s޾DoIќ M₩w r67%ipXҌkLs)dxWSR֫rpj"bwaa+c9R^9RPmKE|9 iKJ%#B]3-kݸ7QZ!*4W=λ"6=}6'Mr55D*\OFpũcdy>oqYpz>L,ʾGR #H/.ƙnj~Yڇ]_t豐w~ >w4iVvhoXcD\>sw#IRP`ݚLXLSڟBh_e8j:@_6dž˳]kٝxލ }.ahbz}Ž'MpDP ES$!AI+R!wmI_!now~wNp݋ovAL%$e;_5IQCQ4>$:F,1]zvWXP!:w$GwLIEEV׆bGpi2|gdW{KܽDŽ&;м`^3{+)ȸuŖS^2h$`rC &HGcZp45{K[\ &=Zh;>Z*>!tm8zPUp NGZfgwb맭ct fES;lk9 Vgk!mݺkuR5T[ \1 GC /4]aѿK,ԥ6'y5=ߍ:i8jwrJi/[ sNEs w~p9UJsG4ΖM^;Ɣ<0Kko4>m ;p932tcNoW\2rqN߇f'Fug^g"`<\qv Dk)A>$4-6(bPecx%TmV{U[+/:k-#ylO0F2_vH \clOatiQTluS͉bP6lZ d_мFM010d)GBH|E((e~y3 i YZ"rbG~ݏ6w&gzZT;X3ݳvxﹳgygOH[X*?y:yH%Lb_ثXA\WWz[vk)Z]xyZ MyѨwus}wsYb//jIɘMljEznǖQM}Sf; ۑ nYq߹DɁ!6'}va2d ք٥K'qErbtg/ ci +KOX2LF֞̾1hB%&MN$3><Ju~ՌiI9]1xH3J\g2Y{"H ,߬dz;r" qaLI# :GFG8];g$9z\>F]<=->3z7H ng_?&h;ڔj[a%3Q5(UT&󝫈jos ԡlfX9Hڠ@oH"q^-冃>~4F ]B B ^0V+ JXugNUzvf+pa^lӹ3,E2Vm 0%i&<ݝ,JWyӈ7:8 CE1/UZ+8o#?*WS%<_nV{m"#wa7XI?\P%!.TVz""]I,#}Bl?n*qJvhoы(y .g=+zJ.(i CuJ ]wk;g)V|gߥ8b>^٠V bLP&։ O2vA=:_[g[O׵@L5-azJbdx[[wPP?'/ANܴ_v!+BfVn`}N) +1ݞ[ kDKd)km2~i VmiSjhͰ8@36V2+%`__o7#n\,OQZBky 2-6.fJkHBS~"Sy]nDxII Q=Ye[kZx^ޯi\Ԟ9@Du>8ߖ}vsES" kQ&NAE)k<^s:KmFhABjEQb 
TV!єk/iiw{ӟ;sӾDK/i2-/>>OW/~*%a! *4 *7fH9iλ)ȴSJ" J3n)Dxd*k|P{PT%CG Xo<)xBag3ugu.AYd Hf8bZp^:}-|Ƣk%ۓ͉9nW ȒJL,CwyɻYj^tyeq`6)$ɄNεq_C^t8!h+ ?BĚ/ :"7F½Έ%iŻ!._)ٞ%|Epg>w7p\n&~_mC;cYzȤ zP:wuXeXdԂpۇMu>%UX] TסpTJӊ1UrWI 2;2}`XT.LolaD[5`ן>~jS)vaPvh )QE*(}tI'+'(*Zn7-^@]\ntpNY!t: (K (u̩ C~_ö(qb.] Mcp}= ZḘe&LeNBU0cQOY1a 6*Ul gBʶ;D-@SK-"ލB)-|A3%4's'$Λ?q 1_޿gu$lGNL`BXRR݌c9)o>0yqQWxha B$ %Q K!4+XK8d/% "\a!rRXZ|ō1JT։]ZRǦ4Ώp+8bFv* {hģ$hƘVϵhG s/I'IWH4-Z0Cw/w 2@2Mn=ke;WCc=qx%KDs-z}r7 &YҥNIl`J6A@(,I,QHQ8Ng<zPНdZAJcDyRlO4THX`ZQ/w+l J;iqD#;$5ʞGp!O6S2|GYTkNF|PdG{>U  ce9쓾!TN|3bΫ^ pS&VNDGI qS1m4M$/eěN4!\cv+AppQMQڨ aMx5kj%1Teub|3N7߾]ʣW"Ns;2-RlJN0#Yb5dA[萂*y(@RM2AG$(ẵ' ѠhɜAC<@\7rQT^EԭYCApd2h>I4+VgU"axd"OivR|yDΚBris1KZhQ4E+yٳU)":ؔJ @QEGpeo(xiC)i"1YKY!Q^K2G=VJ9{VS`+b_\XnЮL(0PK˩MlʼGkyvf U>$- Lh8f{=w6,*O8]g\:j)$ ^n@$E{WҩMms/mɘf&nq2/7~=ԁ@gJp-;/ozeϥ6ԀM('ӆ%dAb.FME%4@`,*0T8Fk\iBu* G/a1"XDJ )D :ol gx"`rE[\q0<&ޣ֜hIb643 LQ"-9rFRy:}Cu"ksNYc/*uq~8ȣ$9^ f+jFp%X (Ei<QaTTY4ZZj3բN\tBRR2!Wsr ~ pHzu%(f9߬~@鉇*pGr*5a j-BEU6́h\bJ;U/sP2U/sPT-w>s%I9072aY 0MD7] ʼnHZ2j糺|VwWu>d(Tu eӆ Y*[F%Lے %SJ.Ay 4Z APvJ3vVV-'"NЀ+.<ꉋh侑hNR%X v2vtcry*`z=ZR"WxHh2FDO!&=+0)z9IDL\*Cp;BXlN sph 4GS cZk"UDdйcuƒɵ{tnٔr0|Ѣ)sk"b=0%{V`ֵ/jgDfI{*8w((27׈rT+jȄsI)a`WQ4yL qhG/wF"Z).G!2yAIB;%~ub< hExK 5%иռԎ2T^ha%T>IKmEB,EdICٻmdWT~ٓCT):'gSݗ@ز#3l߆$KDJ ARTT& ح5v 4 \(nc~U9\Q;0񹓹vۋB"݄2M[`cd `L"=m%:%,ln#nTn Devf'KIrBF9/P}\j"9UIޕɖ' \E5 ) [dڢXLIeЄs*}N1%V^~ tVU l>$HDt;O «% ֵ:#Kآh% X6(V=48T#%I_#{SeCnUDТZJ~'.bJ+ 0$V%+ \P' >eiחBJj4q RI_A0o,,+(f!g^XtXExe/DCmp,v R!is %ɐ;F;gP;?pA :^ ?$ⷬԫV_)QS3(<$"&A)xWⰭz1سp}x W_Z0 ̭Cg6cƏң?+t+ǎC/ý̺i 5ri1鲐|6{vl;jh,T~sѠ7i5srt!lmg\T\຀rc;τwسQ%<ڎQ۩-.N0Z#EGWb5G9@e~i0$=;[3NyX98i<ƙ1N-* bI^*: nbGxsm'[D׷-/YnA^A`Ԡr7.]9R,bNaıӊ^E#=cG@&Hf)297v(\Ly ^wl%ϪݻD7u6aBRG1U)wk $)Vܭ(muf8.]f=n;5Mݺ=fMLiҟۭ5fɫKY.y%䴣>Q5Z}v$D7_pommH*Q*7pc~on X4YR6.}yKGTɹQ-#8tTӸBi تV CGߞj%ۖ%cɫKY.7*.Nx-(+`n=sUP)׊j*kV:%8& #׈pVq%7nO5%jOD,-4{#tŲ_{X~lO'Rt;{q>:xVSڢttH]w<^ ?{ Jn(˫ٳ]B엓{ZyyfnngQ$o1x\jׯWh(n u.$*U*Rk`-[ivv=<ӽM[(rA# hL2)c5uJ1hT bD'u&|+Tc9vK^hvBB" SJ| 
NoޭEH>|x4Sa99@?,Oqpan`pc>W0B.ѰGj)fvo|y9)7sYqc;i),*]B>^Y޺Mg-M?d,cͺ_$NbcY*uE<~+srV-Rhf:\hzUUT\Wn= DJ9.:rC7[d{T6bXA\~=ْ!:qz S]KW ,`eR: +)l2.G*Uy~3>7'B=/Z㢘R_蒹p)I|LXT"ĨR3 cUhSQY"Ԑ*{Q-8zLjoFXKCr;(yrpX MKŁֳ8Pۓ/ếk cgZS1qfBĵ9*1_/풸ᇍ-W"^G]3ǘ!WbB?Ԡܼʩ窀+3rU3`Nkx<%ㆧAPPD]NF4#H]xݍΠuX,XŢׅvfk1pKtps5zp.VUpG!HPMvQSжp"R4gPK[;Ɓ;Y,$ S¦v*{m ŕ^J&%w%I03oL1I6`(shjr0$ !v4'"1cϥ& #1IBch=^Iw#*WxJ}7;#!L¹{ qEjf(Hw -i؂ZI.ۇH^]T&)7L4ALF{zŘàlY1tp̐h\,<깝g;9Ӥm@&5֞jMF T{YK~{}7"ݰ7@t4vzYڇ %ܕgTrs6K Wҝ-xpMRݔ hLۚ-$ǡ19iSJl|O!Vu !pm-Seڛ?'jtw Q+Bع:tJ/Zfa,?o)DK'ޭsz ,0iz,w,ϯU)>BU*ŇrU@7=e%ף0ZWsFvr}sd7pTj}>`O! ,Udq~'Is/nz}E ղrx]f`QYL\0D)b҅)Aaey2!Yk3.\I;t@G OZ^E}e΍9,ۄ \@#5sX&J+-ѸF,d[ j4b!+͉ڜlY3x,nTWe7Wߏd Zt hF=8GHuۋ[Oƭ,_BJXl#+ ?>٣䅊p8{Eb`%1TFKx Tx;A c: r%0)7ݜ]\_ΜoV#mؔ)ͦZjD|u-f"V2 X 9`Espp00@ i,yܒ%9xOļ'r<EZ8`Q% xāDdYQV`nv9^X#^gN;tbRŰyUKs$"[9Qs$L *TXaTEõZ)*!dJ4xo';c-A(WxjY:zn\[<7 mq2.XQ0 JTZaX}rX V=Q7g`_ogPkk񣍵&i' M9Zm[|w:%ǵ|pQ=,bQ}#ࣿl.oN`|bt3l4#?hfrm|uAw0D|{ryGı&Bhra?wo_l0[3G5{Ł8!۩cUYyݶRG(c IYs5"NY'=jE[]L054I#ILI,ޚXqQO01oL.@I*Tm.]2goMY"92)1; (X$_oBjgfF͋nAܢZQAJQgt;~T6 ;=Ժ%$䕋hL5:uU'oi7J:-).,3}k\DVBB^-SU̶?KTB2,D' HsҦC{F,`͌#0gvx&JIP95S^6a9{z, y,ЇbAV[!)ly!ܔN *aUc/US1U5WLlBYRS"G(\yfqx\`JLQ_6B;U0*σbzyYys!F$S¯ umњ,+1ڜ<~vC>o;n|'qc-" [{`XJWkTDr+'J?Fj& WH~RGvUH<܊4ND`&U\=# 164+h͙ 1YV&ijreʻJ'?@O^}5=wb|v'׷˛)˸\ޜ{QRw_X~jf.&Kw3yxEm _yd"ŵ^^0Y4o؋Q|wB?[=F4)(eT>uZKgܔ $Wj+ kb PmంK .1"xRL?\x]{4]]c {:sZiuVũuVd3~N &"9"sCUxg3ԍB.\=w a=6&5ca %"1Ym Va~ f2^Ff.kp0F,r*^2ʵmk#lCi-bT쨱Hۘn\ݺ~cU^ܢْܞ]+s[a'3}^c 30sY84bd]ˍOʇ&?XXί.gP \ SvpY(pԷeI qQZS6e8Ob*q-m39耶R L# Nś;LKcUp$r8diA};Vv= Ăw O6NA#.N&dFnUzVwذbCBs>tc S:p?ĄS ]њ7In }<n#+ >\Kj׵t5;j#`4ߕ͡I<)p.~aLIWg]8-gK~U3O~xlV2©^uՃϏv2Jq남hv E(pO3!GsN8V!mB9hdA^Xw!WNB˄V_|u ^rH Ƨ;65A 嬡'[ 2ik9KJ!N%H]fkՎg|Oi[(= Pw Fo[y8a U!J<^Ѳk%ay/Iʄ,_ft^uf dϹ:yZ@!]ǡg 4Vntpm)q(B&]ݱ5l^--}{3x sC%\HE'a:-F?EA3H2,&ΚF^S9KU! 
Bg{İL6xX#gp)sc(v#2L,S% @$vIFxZ `B/J[͕L\fɎg|P >|钰j/ 8ِ`!6RBO6x :B{,'aRC"뉅$4^,JbzA؆lV5\&_&?0u5(X_-gk~~ O&S͙[׫]XOp-!J{[}S/8_Rc[0-跃 넋f̈́p^;ir/f|Z=m^wor/ϝj}aklJ}NcᄏZ p d8`oow4l,ӯ7*gG^(Gf^Y܁N>|(9~ⴤj9H"K`<`MuIZR2z2f./wC4h`j6WSl$ \ u5m|!Fio7Egok=$F?eӌIyʉQcDm1b[!`}?.L7cfº^j`fz.͍U/mwĬ2dJ+ER{_-':,]L'#C^wbp6>ZgU'_V^ f :?&0*d ph. pM7Jv2J5ꀨd#WchTqXlغrxEBm?(/28| ؍#Hk Z^aôxLʮH]_:@(z\UK}H ROG S-<\GޔPTϦCo\ #^p3H,+Ѥ(nB\ (2^.Vo}a]U~k icV33Wz㾖}YI][MX,6 RBI&V)(QAY)d ̍Վ0g1 K4qkH=@xrXH~Kbs{8{9RZJsem$ bd"*iDa$PY池z'53+]!*,WeG@a+ñp,merL6V8cJuΡf2S,8yΫ?Ӕ -ryUsF @*!"+QF{SfRÕB`v8P!f޳*' F;)PCZUQ,g\+BV`I@ʜ "|~z4qy ~W~r:埢~T~ൻ&Ƙx=?j-NJU~#+Nͫk-lMd2bp`SvV).$A {A]-oP 1.6*rz}_^@_IJxJpjor͑)oKM4̜j\N3h#澵[@S[ y"#SP;4iT%:vA>v; RnMn%$䕋hL :zz[Fj\N3hCHh0햟JH+ѽe9涁H9ZؔUj[(+\{f@qD?冭ŢlǕPq|sI,o0Zk'[i"_@۸Dxl9i4]mWhH|Fc~~0 ?B FH. {D֜5R~6Y2 e{OIɯf龮31=4~?^\|rjbr'Sx}W,-=uKbҳfzo T!đpx {aIUi.e6N\3~2]!z 8}d\`=r=cWqU[Ќ4J<`/'z7'rW?{ƍA_.dbM$] Er/9,ZY!x uw!σڷXa}:0ldS2yD$=0"V*_h B TXhaISqpB.8NR Z3BUb)7OOi:!1M̘fakID5#&sH@D 4X&JC@fP=K<2qeQ`A iJ *6&j],TGuNI9rAZP’232eT ‚qjP%D&GH~\- m`>IlW;suiS7;~{X,,'R:!a릏סoK@ _b؁SxԼYN dadYie SjRXf6jMeY۴UВ@EfkOa hqIUww~>}=_ǨF;MWH-Rx vBeJ%\*)ƌ#%8NDInxx_;T{>:^j7)roK1ؗ(6'M*?28֎DŽ d! dɩamB/dd*`F-2^Q$"&p32i )ܪ*bBp垈!ϛ^l ;V`blXUHoNJYQ+ذ:۱ˮ !@A8[RlD ܿsHs S; komjP1w*rOEA;JT*v|EH͖;*6ƳDHei>:Ϊafa=Y3D^;JFù*7'=H#4KZA詤{Y{W.#)\ěcׯ|w5cv~QWX=? 
?ϲ<-eȞqQ̏rkD pYV 7(Φ1i=ѸB}aeYӭ3J\b,ϕLZSm-+MVO=Hh[P"@gff.ͳQ(N)X0ԧ~ ]%Khhy3c;*-[TCNe*gzUͳha;162c\iZ}2c4{7[?m9H+g͂ AtQp$s&1a'ԝƤ@TK"zm7VcTY)u ZAB^ɔi_[(>:Fݩ婓޴v 3n1$䅋hLICF8F%ۭgSy4nn$䅋hL9}==nf>hP |D't*ڭvS֞bH  ܒApFs% ɹE0bI~c )<#Ck/Ju(h?!J@0 QWwΘO`O+!O`O+֍^O`O+APz~~~'\ MOO+3OO+38烟0 QWXfcYgJnp8v(X#X^csB/ǣۇe/nVOHKWKo׿5CtDO)hNmA x܅߭2yҵ!` ܗޙPno]I©OΎA|qKǗz䗰>AE4v 9;#Q@|C듮CpaDŽb}]/8y-) O+A(:E=B>YdCsZ>wLkzAf%t +umb6Mvԍ/\1ֱ Rt2 \۫=,I)8UZ,jÖsWkϸJg{_Զ0`l Ҕ :92ƶSlRJ4C9@ nVĈd՜(쉇6vp~ZAؠ7m+<=ֆ 47o\)pa _L$xkVgmF|w9F 7N{`}xzMw#-"k{A>%6߈}9bv)F,x}L%f,Mƣ =,]vbeb&$f=Mm.(BOJ?~޲BL`ipO!S$ⲥL;I0P2 uڪo1+F7f}fvR/oTwS]R$_6{[ /G5d򌄷$`c6"Ql)<Yi6F~%!,-@55~^ao6Pٯà^f2lp} 1[ \D#NOY`%USjPAt>PE?ED3$i2!ЉLlCaD#Btf>Ad)N`䐂\Ir+mTs3eBcfVH2R# A@D2{=$ILk&93ۓ!/NP0{2=)Nf:u=R۽ix^`8:m~iՓN } 8~7 CZƐ޿߻z}d>C.k j^DSlPaT#%qD=ĖIZWEV~g m*y=|vpBd4ġj(ʟnd6g YT+OZ ziL:`p=cW7E@i3BfD8vXI΋g9A.[*Jq5߼էB0@R%e2Sm8@<ZIؖRi*r95O279&!l;IG;oزa4[/N"/Cg29_~0 .>F"У~)Y3B% ^uG)=KLqs73-|% E g3\Hsٜ=bQ>B++*ܣ'6~GP O~ZqQo&!V6NZ-B,kh]dwCw ʺ=ۃp EQ7(O.{dE8 wEF.Dz"^qh8lk$t?'gZЋTs3:$?G>u=E+s8;^`ǐ׉0˹N!"Yh=Fu5C&bfvȗz䗰YQ" I/a:S"c8XoEv^8ϢAqџųcO_V]s,9  o='s$[f$Qc4e]L\vItG o|q4GnܻۛM#[oefsB.?.muG]Te>5No:nf~ #I_WZXV"t]+֬+~#xDDkqi+tNF%?Hë֬g-JVyݲch$ y"$S>@IBD[(>:F]T)uNnᅦ<[ y"$Sv_a脎QEuax1jM׆n1$䅋hLQC&$v GtBǨ:R0~дv n1$䅋h}"ڧx(jڋR '67HI[DhKFg`*=f{5 ݕu jI+T́C_a{mn}HMZ ^2^W~\9rWdB%Y%'lr, j巔nWWv}mopRqDBbz~8 {N<x/*̑1Β9TqFթ}V+ { Ybt4t@Xȝؐ 1`*q1~7~o~oM)2`9Bm;1}W]AĆzX8 /I' PXOppÁq*K !m$M"4o(EB6Lt/Y% MzO44%}>S<c3D ANwmmIW| #"@O,6؋<D_-n(J);N!% ERjrPmXLל>]]-*LTܢך[yR7X qf# mt];Elv^YsFDzbuYp/Yצ yFtgѯ݅L.:f&+Jxg_cL>,>ȶP\0KVNŖ`},/+ztR/=XT-/8-z-z3T8dm%}TW(: RB-'fh3 {iXCT! ;/ D҅sSs2v7q<8 E[+aN90aΉw!|)8TxshXsB%! 
1 0>Xq;g<ŎBI(ԅ9XxqX`H p,xÞ܃&  IiqrD p\LBZZb/iH}/8m>0a[0#ǘ'[Crm.ӘnFaRFD>.i(&_ 3ȱ `yAhcEE !To+X]_{+ңōf&,%d l$prV0pH:7vG͓<|< A'8V: Fi#,70$u`g}(s3%~`βN.؃LKi%kstͱ;X_9s3xlb 9 22#4pA+Zֽu:JGg㋁wG)֠t 1h]0mwvRM۱ߧsm &h@xsUc#0>!`r8w~<($*2~ Q)% 6p7{o#R CРГT3P1QH ưYbdأ y(xd Pj-,NR>*(^ip׌7gYä=sHLf {PKNf s3!dɭP96 Z]3Yp5ˁ{aTb97WVefm&ضwRt=xf#]ֹJ$_$o+d?Hys4ZDPn oCzQYd~O?GkHV:>;_]jLM7fW{qKZG67P]\ FG#hKZlxh@)aϭxP~0HOLB07{}~RqBT2IXλ#'׻w]''`M{$yk\Oj4X;^{-Z6?޵,bK4՗* u<$r7$KKɗup{z(QC%rn]U]U_[ zڃXN']b66 ywjR?,3WUś8?{sswy_mRr~W:2-Lew "sͯmx.?*9dӽn\Ca-EʵT}6/,J^ytQMHMtMmwz7dw*>myWѩ[M6|YwB^)8x"Nw*>mV ;ngMWz1,䕛MUvI4kb֝5?[GyMe pa^/--PrƖmQv~$~IP9@zI+LcLЍot9H179kҩnڷZnoD)iJChQV{Qɘ,mwܠ!= 'X`z>U)0^i8#J,)AbӠkbFpO@=V`ѵXP/8\SA%  R*5=WV[btF_?k5`w_*(eGGOK3?(kPhx2ȯ3vyFXH!ojCrkb 8@V5&xjt*C&@2._ə bf#Jlr^׆4Q2VIf>E{Ӟim2:`GhَDRH}$X!ݯ,U+fFd#8ccWV]YH-H {fM.U$"R90#عZkH_[QThqנ{Wsk1ST2i [!U5'C 3I,gCP%"$"E͠`Ǡklga5J\۹$ߐ&r v.EǫfVmJvZB ɫ#5ҔZΥ"ɳg*'zvBGkauTb ͖W$A%j6ads8|* €gPbǂ0[˃rϝ $l6oh,%Dh'w=SY*VdIj;P=-~L,{N04+ 'Lǎ葳8Ws) 88anD (H(\5C4HD'ўi*[,]^9FdɊc;eE$B* eFbQQÏ?rޢd kfeg&; y5~?6:xԴQ[ytb\Zz3, =!;H;>ǎz_18Dh<|}7S+OrCǶSm䆍/~6 9nn)AM/wi'Fwx`lB^)GIw*>mAѩ۝w~{]ưWnCl| 21`mhgmx\wB^nlS%wZBuD&`i{EKukXŌfw;--Shs] Ztkq=#(U[!iv-AП{sHMQ?H7:Nͭ Ǻ}@ &&NԪ6*-Τz[ST[+Ҷ- ]&q<ߔAgyYFyZ `ӈ&j8ɲV1:Yk{g,3n?ķ?f=\2T De6eÏKI/1+rI!K٠(k׊tVI2#siڞ떫(IW\@{|Y0`'"7qFR*i6L=V"PmyqFҩ!]r/+3A*&h-Th9!i~!K%=l2[&9U*h.d({ Yi{O;z4j{|={JcF?u4HЁdlLfO7᳆6b?!tn/c>fYqmQ:*J V"\fb'7_8 v-{ϳyj|n2r*ylŔ''*!FHs0'؜,ۑu+2Άl\}]%knq`+CYbdRԍS4ˑ0낧~B|]oeq~ҽ?~ɋzu.B.PZt7}Jm!֎ Qьͧ {J!wXw<R[%ds֨Jm)3Jhd&,%] k4s%Y';m (n;"eQkۯrYȺ"\w囚-E帞sߖU;ї-Nŧӟ? !)resf QIga1(9@*׻bz<`scR{D Jp&Rbg -vEK$wxN95[mdH1uu )] unΝa!Z 6L`tάNy -IoRX鯯ƔW/S_7[!SUFLmx+z~ҩy%l\MȟMlks96Sr$(x~rtI k\ErGg ""T {HM1%~<'' d ?4MZ 5'kXsKTb(@ph%)KpDeo~肸9Z/ ]) =ʌT3I7Q/Ŭd kB@Bm#PAm6 xǍ ilP_—IBzEHBpn5UU+eoC%P,8%OA%0(ugnsfc Sw\|ggvpaʄ1-@`t{dݓ*R9c;vxj{+_ųIS~F 78u78U7&c|^qX Nِna *Ǚfs sR[߶RkUeL>YnZ& Tey}孞'zfTܫ]mfҭSܹs?[k S(3.RpSOw׮+W\I Oؑ9Ҿ EZ22;rQ DFO_QxNCAΓXW)Z}5qG4Q&Ud$+Wyͮ J8^Y*; 6eP=0vu1,Aou^F-K>w>wk. 
%* g*+4Zxd.voK] klۤJ:`qw*Y9`[|H *= +Յ3vr';ꦯn-q6Ghu\}GFw3(ZكeCZw{$3{鄤֎IwIqh$ JĦhs6 1IK|/6d!HRf5HbZ%;-F1slC_d:F "WePa4SC;rTbNV! ˩ fQO pj!{Wƍ/Cneshp%hR븑eגE w%kfs:X^q3 9 hÿX8#aʂoSV0a$#P9,/ 1dlh8]K,ݥ_1ɿ^ OK`lBfeF'xz`畏ޖ/^߰7-?T%+%]7>]bU I??Qy7'WwQ!b+AR J(tL?2YIdpRbBO&84q"xzѻ ]/,6x%ww _?.c c ۹s K0bpP{o/Ӌa'D#ѭm_,Gg>I֝aq }[GZQF4'.4 ^6Iӌ%Eq RTMoRҟ ^H] -B)9$nxX h~|:KGw?Y?}..sO3k.G*THYϲ< 1jZPȟEt.Ve:#mI7S4#AdªT*e.=U-ynnjnEy/:=?;v+y\WhT٩ UL}^_ͨ6+Z"bh?ؔhFi4{lJ p ՌZYu qz_\¯Ҭ)Ђ%cD8x7ɉ1"$V,'-YQ y]o&SG=ljҋHT/͝f3); cPicg?;M2Jil)ݐLbkTy ̵" sR3,Pau9X To@>2H5K76QA@G6\ɜeEt.i@;ʍE@-u@Rfi TٹG"bDЧmcTf!7,öyBn yNL N^$섙ENáiW]]KTAk=LRu4Lg~ȅv=FLTMjb&'jycS[iϿ䓭rF`H@Pm@}ENsT& VLd #4(<¡C 9Zʀp?!oRw_g9$9K gF2]x .4u@rTՙCM`=V߄ъ}MyZesT-4p~^;+Nh-%gx|8fo0!̊R߅zvRn0AMd )wfvs0eDSiϕ#~Q@odk=_k灲(`E[(|3&ZjW၊6ih.)Iv¼I'RX Q3fq[Ti,aR׫4fc/2è,H0EAXX!9!W, ./j'Ԣ,`f1r=X[*a8FfCşy/kҊֲ@QcBr%3@K:eizkpA|*.p#o۵4gJRFxSjփdEZp'QMRsi 3RXc]$SUSJ9[(hQXZh̊,͈5g 2ZPT[LFl=Aե0`t.~/X)6׽AQ33w["~Eu܈$Lאt~YU9~{XjrS@i "u,|KY 6F(Fc%f[bDt+6՝B,8 ʷ8;udŦGXͧ-Q;@$ QR+Sx ƨLDa(nSÐ_xLΓgs ڥ@`8: = I@xELBsM! 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
Mar 13 11:48:40 crc kubenswrapper[4837]: body:
Mar 13 11:48:40 crc kubenswrapper[4837]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:10.494292214 +0000 UTC m=+6.132559017,LastTimestamp:2026-03-13 11:48:10.494292214 +0000 UTC m=+6.132559017,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 13 11:48:40 crc kubenswrapper[4837]: >
Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.389365 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642b51b70040 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:10.494451776 +0000 UTC m=+6.132718579,LastTimestamp:2026-03-13 11:48:10.494451776 +0000 UTC m=+6.132718579,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.396337 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Mar 13 11:48:40 crc kubenswrapper[4837]: &Event{ObjectMeta:{kube-apiserver-crc.189c642d82253ae4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403
Mar 13 11:48:40 crc kubenswrapper[4837]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 13 11:48:40 crc kubenswrapper[4837]:
Mar 13 11:48:40 crc kubenswrapper[4837]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:19.896916708 +0000 UTC m=+15.535183491,LastTimestamp:2026-03-13 11:48:19.896916708 +0000 UTC m=+15.535183491,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 13 11:48:40 crc kubenswrapper[4837]: >
Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.402550 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642d8225f364 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:19.89696394 +0000 UTC m=+15.535230713,LastTimestamp:2026-03-13 11:48:19.89696394 +0000 UTC m=+15.535230713,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.406505 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c642d82253ae4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Mar 13 11:48:40 crc kubenswrapper[4837]: &Event{ObjectMeta:{kube-apiserver-crc.189c642d82253ae4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403
Mar 13 11:48:40 crc kubenswrapper[4837]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 13 11:48:40 crc kubenswrapper[4837]:
Mar 13 11:48:40 crc kubenswrapper[4837]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:19.896916708 +0000 UTC m=+15.535183491,LastTimestamp:2026-03-13 11:48:19.902502782 +0000 UTC m=+15.540769545,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 13 11:48:40 crc kubenswrapper[4837]: >
Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.410940 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c642d8225f364\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642d8225f364 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:19.89696394 +0000 UTC m=+15.535230713,LastTimestamp:2026-03-13 11:48:19.902559993 +0000 UTC m=+15.540826766,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.415551 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c642ad2f0f591\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642ad2f0f591 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:08.367543697 +0000 UTC m=+4.005810460,LastTimestamp:2026-03-13 11:48:20.166859084 +0000 UTC m=+15.805125857,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.420690 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c642adec52071\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642adec52071 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:08.565997681 +0000 UTC m=+4.204264484,LastTimestamp:2026-03-13 11:48:20.361068104 +0000 UTC m=+15.999334867,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.424807 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c642adfa53139\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642adfa53139 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:08.580682041 +0000 UTC m=+4.218948794,LastTimestamp:2026-03-13 11:48:20.36997771 +0000 UTC m=+16.008244473,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.431790 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 13 11:48:40 crc kubenswrapper[4837]: &Event{ObjectMeta:{kube-controller-manager-crc.189c642da56d5244 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 13 11:48:40 crc kubenswrapper[4837]: body:
Mar 13 11:48:40 crc kubenswrapper[4837]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:20.488843844 +0000 UTC m=+16.127110607,LastTimestamp:2026-03-13 11:48:20.488843844 +0000 UTC m=+16.127110607,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 13 11:48:40 crc kubenswrapper[4837]: >
Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.436683 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642da56e187a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:20.488894586 +0000 UTC m=+16.127161339,LastTimestamp:2026-03-13 11:48:20.488894586 +0000 UTC m=+16.127161339,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.445106 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c642da56d5244\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 13 11:48:40 crc kubenswrapper[4837]: &Event{ObjectMeta:{kube-controller-manager-crc.189c642da56d5244 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 13 11:48:40 crc kubenswrapper[4837]: body:
Mar 13 11:48:40 crc kubenswrapper[4837]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:20.488843844 +0000 UTC m=+16.127110607,LastTimestamp:2026-03-13 11:48:30.489017877 +0000 UTC m=+26.127284640,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 13 11:48:40 crc kubenswrapper[4837]: >
Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.452094 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c642da56e187a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642da56e187a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:20.488894586 +0000 UTC m=+16.127161339,LastTimestamp:2026-03-13 11:48:30.489077039 +0000 UTC m=+26.127343802,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.457582 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642ff99eed20 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:30.491315488 +0000 UTC m=+26.129582251,LastTimestamp:2026-03-13 11:48:30.491315488 +0000 UTC m=+26.129582251,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.462624 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c642a53d77337\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642a53d77337 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:06.235165495 +0000 UTC m=+1.873432248,LastTimestamp:2026-03-13 11:48:30.626406165 +0000 UTC m=+26.264672928,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.469373 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c642a68d15582\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642a68d15582 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:06.58708621 +0000 UTC m=+2.225352973,LastTimestamp:2026-03-13 11:48:30.831602585 +0000 UTC m=+26.469869348,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.474618 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c642a699105fb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642a699105fb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:06.599648763 +0000 UTC m=+2.237915526,LastTimestamp:2026-03-13 11:48:30.843791993 +0000 UTC m=+26.482058766,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 13 11:48:40 crc kubenswrapper[4837]: I0313 11:48:40.489028 4837 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 11:48:40 crc kubenswrapper[4837]: I0313 11:48:40.489123 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.494892 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c642da56d5244\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 13 11:48:40 crc kubenswrapper[4837]: &Event{ObjectMeta:{kube-controller-manager-crc.189c642da56d5244 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 13 11:48:40 crc kubenswrapper[4837]: body:
Mar 13 11:48:40 crc kubenswrapper[4837]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:20.488843844 +0000 UTC m=+16.127110607,LastTimestamp:2026-03-13 11:48:40.489099458 +0000 UTC m=+36.127366231,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 13 11:48:40 crc kubenswrapper[4837]: >
Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.499619 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c642da56e187a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642da56e187a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:20.488894586 +0000 UTC m=+16.127161339,LastTimestamp:2026-03-13 11:48:40.48915876 +0000 UTC m=+36.127425533,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 13 11:48:40 crc kubenswrapper[4837]: I0313 11:48:40.982127 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 11:48:41 crc kubenswrapper[4837]: I0313 11:48:41.248938 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 13 11:48:41 crc kubenswrapper[4837]: I0313 11:48:41.981022 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 11:48:42 crc kubenswrapper[4837]: I0313 11:48:42.980784 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 11:48:43 crc kubenswrapper[4837]: W0313 11:48:43.978894 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 13 11:48:43 crc kubenswrapper[4837]: E0313 11:48:43.979211 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 13 11:48:43 crc kubenswrapper[4837]: I0313 11:48:43.979359 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 11:48:44 crc kubenswrapper[4837]: W0313 11:48:44.701753 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 13 11:48:44 crc kubenswrapper[4837]: E0313 11:48:44.702321 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 13 11:48:44 crc kubenswrapper[4837]: I0313 11:48:44.978629 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 11:48:45 crc kubenswrapper[4837]: E0313 11:48:45.133130 4837 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 13 11:48:45 crc kubenswrapper[4837]: I0313 11:48:45.544911 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 11:48:45 crc kubenswrapper[4837]: I0313 11:48:45.545208 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 11:48:45 crc kubenswrapper[4837]: I0313 11:48:45.546832 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:48:45 crc kubenswrapper[4837]: I0313 11:48:45.546896 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:48:45 crc kubenswrapper[4837]: I0313 11:48:45.546923 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:48:45 crc kubenswrapper[4837]: I0313 11:48:45.548023 4837 scope.go:117] "RemoveContainer" containerID="567274bd739faf34091c836a1d1ba1184d2ed741ed419f592fc7dbec60b92e8a"
Mar 13 11:48:45 crc kubenswrapper[4837]: E0313 11:48:45.548699 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 13 11:48:45 crc kubenswrapper[4837]: I0313 11:48:45.845164 4837 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 13 11:48:45 crc kubenswrapper[4837]: I0313 11:48:45.866691 4837 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 13 11:48:45 crc kubenswrapper[4837]: I0313 11:48:45.980494 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 11:48:46 crc kubenswrapper[4837]: I0313 11:48:46.377613 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 11:48:46 crc kubenswrapper[4837]: I0313 11:48:46.377882 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 11:48:46 crc kubenswrapper[4837]: I0313 11:48:46.380792 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:48:46 crc kubenswrapper[4837]: I0313 11:48:46.380837 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:48:46 crc kubenswrapper[4837]: I0313 11:48:46.380857 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:48:46 crc kubenswrapper[4837]: I0313 11:48:46.382164 4837 scope.go:117] "RemoveContainer" containerID="567274bd739faf34091c836a1d1ba1184d2ed741ed419f592fc7dbec60b92e8a"
Mar 13 11:48:46 crc kubenswrapper[4837]: E0313 11:48:46.382526 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 13 11:48:46 crc kubenswrapper[4837]: I0313 11:48:46.978799 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes"
in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:47 crc kubenswrapper[4837]: I0313 11:48:47.312989 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:47 crc kubenswrapper[4837]: I0313 11:48:47.315007 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:47 crc kubenswrapper[4837]: I0313 11:48:47.315238 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:47 crc kubenswrapper[4837]: I0313 11:48:47.315421 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:47 crc kubenswrapper[4837]: I0313 11:48:47.315622 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:48:47 crc kubenswrapper[4837]: E0313 11:48:47.320882 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 11:48:47 crc kubenswrapper[4837]: E0313 11:48:47.320993 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 11:48:47 crc kubenswrapper[4837]: I0313 11:48:47.981343 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:48 crc kubenswrapper[4837]: W0313 11:48:48.114537 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource 
"services" in API group "" at the cluster scope Mar 13 11:48:48 crc kubenswrapper[4837]: E0313 11:48:48.114623 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 13 11:48:48 crc kubenswrapper[4837]: I0313 11:48:48.980458 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:50 crc kubenswrapper[4837]: I0313 11:48:49.978054 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:50 crc kubenswrapper[4837]: I0313 11:48:50.488507 4837 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:48:50 crc kubenswrapper[4837]: I0313 11:48:50.488623 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 11:48:50 crc kubenswrapper[4837]: E0313 11:48:50.498278 4837 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-controller-manager-crc.189c642b51b490f6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 11:48:50 crc kubenswrapper[4837]: &Event{ObjectMeta:{kube-controller-manager-crc.189c642b51b490f6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 13 11:48:50 crc kubenswrapper[4837]: body: Mar 13 11:48:50 crc kubenswrapper[4837]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:10.494292214 +0000 UTC m=+6.132559017,LastTimestamp:2026-03-13 11:48:50.48858393 +0000 UTC m=+46.126850723,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 11:48:50 crc kubenswrapper[4837]: > Mar 13 11:48:50 crc kubenswrapper[4837]: I0313 11:48:50.982200 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:51 crc kubenswrapper[4837]: I0313 11:48:51.981122 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:52 crc kubenswrapper[4837]: W0313 11:48:52.425492 4837 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 13 11:48:52 crc kubenswrapper[4837]: E0313 11:48:52.425566 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 13 11:48:52 crc kubenswrapper[4837]: I0313 11:48:52.983114 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:53 crc kubenswrapper[4837]: I0313 11:48:53.299590 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 11:48:53 crc kubenswrapper[4837]: I0313 11:48:53.299797 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:53 crc kubenswrapper[4837]: I0313 11:48:53.301224 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:53 crc kubenswrapper[4837]: I0313 11:48:53.301280 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:53 crc kubenswrapper[4837]: I0313 11:48:53.301297 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:53 crc kubenswrapper[4837]: I0313 11:48:53.977427 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is 
forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:54 crc kubenswrapper[4837]: I0313 11:48:54.322085 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:54 crc kubenswrapper[4837]: I0313 11:48:54.323957 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:54 crc kubenswrapper[4837]: I0313 11:48:54.324056 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:54 crc kubenswrapper[4837]: I0313 11:48:54.324080 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:54 crc kubenswrapper[4837]: I0313 11:48:54.324138 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:48:54 crc kubenswrapper[4837]: E0313 11:48:54.329994 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 11:48:54 crc kubenswrapper[4837]: E0313 11:48:54.329838 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 11:48:54 crc kubenswrapper[4837]: I0313 11:48:54.981114 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:55 crc kubenswrapper[4837]: E0313 11:48:55.133355 4837 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get 
node info: node \"crc\" not found" Mar 13 11:48:55 crc kubenswrapper[4837]: I0313 11:48:55.980193 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:56 crc kubenswrapper[4837]: I0313 11:48:56.979511 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:57 crc kubenswrapper[4837]: I0313 11:48:57.981026 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:58 crc kubenswrapper[4837]: I0313 11:48:58.982121 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:59 crc kubenswrapper[4837]: I0313 11:48:59.047945 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:59 crc kubenswrapper[4837]: I0313 11:48:59.049993 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:59 crc kubenswrapper[4837]: I0313 11:48:59.050096 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:59 crc kubenswrapper[4837]: I0313 11:48:59.050115 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:59 crc kubenswrapper[4837]: I0313 11:48:59.051070 4837 
scope.go:117] "RemoveContainer" containerID="567274bd739faf34091c836a1d1ba1184d2ed741ed419f592fc7dbec60b92e8a" Mar 13 11:48:59 crc kubenswrapper[4837]: E0313 11:48:59.051281 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:48:59 crc kubenswrapper[4837]: I0313 11:48:59.983410 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:49:00 crc kubenswrapper[4837]: I0313 11:49:00.488900 4837 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:49:00 crc kubenswrapper[4837]: I0313 11:49:00.489038 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 11:49:00 crc kubenswrapper[4837]: I0313 11:49:00.489150 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:49:00 crc kubenswrapper[4837]: 
I0313 11:49:00.489491 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:00 crc kubenswrapper[4837]: I0313 11:49:00.491310 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:00 crc kubenswrapper[4837]: I0313 11:49:00.491370 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:00 crc kubenswrapper[4837]: I0313 11:49:00.491384 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:00 crc kubenswrapper[4837]: I0313 11:49:00.492448 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"6f6e6211c6e06af773a58005617cdc56edcb5787a72302dacb1aa7602572beb8"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 13 11:49:00 crc kubenswrapper[4837]: I0313 11:49:00.492664 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://6f6e6211c6e06af773a58005617cdc56edcb5787a72302dacb1aa7602572beb8" gracePeriod=30 Mar 13 11:49:00 crc kubenswrapper[4837]: I0313 11:49:00.980113 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.321074 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.323508 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.324053 4837 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="6f6e6211c6e06af773a58005617cdc56edcb5787a72302dacb1aa7602572beb8" exitCode=255 Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.324111 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"6f6e6211c6e06af773a58005617cdc56edcb5787a72302dacb1aa7602572beb8"} Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.324192 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b0cb44c62a16dac6c4ffe8a78228279de3d95df063c3450e21ba1bd7d3d27f29"} Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.324230 4837 scope.go:117] "RemoveContainer" containerID="f78690d91eabf6f2c116b2e2bea9989a42acaeeef513ed5a6050a251c3d03066" Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.324325 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.325291 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.325332 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.325347 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.330906 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.332924 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.333060 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.333104 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.333151 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:49:01 crc kubenswrapper[4837]: E0313 11:49:01.337311 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 11:49:01 crc kubenswrapper[4837]: E0313 11:49:01.338294 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.979287 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:49:02 crc kubenswrapper[4837]: I0313 11:49:02.328496 4837 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 13 11:49:02 crc kubenswrapper[4837]: I0313 11:49:02.329840 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:02 crc kubenswrapper[4837]: I0313 11:49:02.330844 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:02 crc kubenswrapper[4837]: I0313 11:49:02.330890 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:02 crc kubenswrapper[4837]: I0313 11:49:02.330900 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:02 crc kubenswrapper[4837]: I0313 11:49:02.979282 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:49:03 crc kubenswrapper[4837]: I0313 11:49:03.978178 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:49:04 crc kubenswrapper[4837]: I0313 11:49:04.978596 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:49:05 crc kubenswrapper[4837]: E0313 11:49:05.133765 4837 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 11:49:05 crc 
kubenswrapper[4837]: I0313 11:49:05.977471 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:49:06 crc kubenswrapper[4837]: I0313 11:49:06.978074 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:49:07 crc kubenswrapper[4837]: I0313 11:49:07.487289 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:49:07 crc kubenswrapper[4837]: I0313 11:49:07.487786 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:07 crc kubenswrapper[4837]: I0313 11:49:07.489205 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:07 crc kubenswrapper[4837]: I0313 11:49:07.489225 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:07 crc kubenswrapper[4837]: I0313 11:49:07.489235 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:07 crc kubenswrapper[4837]: I0313 11:49:07.492937 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:49:07 crc kubenswrapper[4837]: I0313 11:49:07.733200 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:49:07 crc kubenswrapper[4837]: I0313 11:49:07.978383 4837 csi_plugin.go:884] Failed to contact API server when 
waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:49:08 crc kubenswrapper[4837]: I0313 11:49:08.338403 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:08 crc kubenswrapper[4837]: I0313 11:49:08.339605 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:08 crc kubenswrapper[4837]: I0313 11:49:08.339723 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:08 crc kubenswrapper[4837]: I0313 11:49:08.339735 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:08 crc kubenswrapper[4837]: I0313 11:49:08.339787 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:49:08 crc kubenswrapper[4837]: E0313 11:49:08.344886 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 11:49:08 crc kubenswrapper[4837]: I0313 11:49:08.345901 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:08 crc kubenswrapper[4837]: I0313 11:49:08.346778 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:08 crc kubenswrapper[4837]: I0313 11:49:08.346836 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:08 crc kubenswrapper[4837]: I0313 11:49:08.346849 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:08 crc kubenswrapper[4837]: E0313 
11:49:08.348982 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 13 11:49:08 crc kubenswrapper[4837]: I0313 11:49:08.979697 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 11:49:09 crc kubenswrapper[4837]: I0313 11:49:09.348862 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 11:49:09 crc kubenswrapper[4837]: I0313 11:49:09.350122 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:09 crc kubenswrapper[4837]: I0313 11:49:09.350175 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:09 crc kubenswrapper[4837]: I0313 11:49:09.350189 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:09 crc kubenswrapper[4837]: I0313 11:49:09.432170 4837 csr.go:261] certificate signing request csr-f4sll is approved, waiting to be issued
Mar 13 11:49:09 crc kubenswrapper[4837]: I0313 11:49:09.441846 4837 csr.go:257] certificate signing request csr-f4sll is issued
Mar 13 11:49:09 crc kubenswrapper[4837]: I0313 11:49:09.536974 4837 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 13 11:49:09 crc kubenswrapper[4837]: I0313 11:49:09.839180 4837 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 13 11:49:10 crc kubenswrapper[4837]: I0313 11:49:10.048249 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 11:49:10 crc kubenswrapper[4837]: I0313 11:49:10.049565 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:10 crc kubenswrapper[4837]: I0313 11:49:10.049612 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:10 crc kubenswrapper[4837]: I0313 11:49:10.049626 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:10 crc kubenswrapper[4837]: I0313 11:49:10.050272 4837 scope.go:117] "RemoveContainer" containerID="567274bd739faf34091c836a1d1ba1184d2ed741ed419f592fc7dbec60b92e8a"
Mar 13 11:49:10 crc kubenswrapper[4837]: I0313 11:49:10.354340 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 13 11:49:10 crc kubenswrapper[4837]: I0313 11:49:10.356997 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8"}
Mar 13 11:49:10 crc kubenswrapper[4837]: I0313 11:49:10.357167 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 11:49:10 crc kubenswrapper[4837]: I0313 11:49:10.358151 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:10 crc kubenswrapper[4837]: I0313 11:49:10.358197 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:10 crc kubenswrapper[4837]: I0313 11:49:10.358207 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:10 crc kubenswrapper[4837]: I0313 11:49:10.443359 4837 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-02 11:12:52.35671313 +0000 UTC
Mar 13 11:49:10 crc kubenswrapper[4837]: I0313 11:49:10.443425 4837 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7079h23m41.913290594s for next certificate rotation
Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.047276 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.050484 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.050562 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.050580 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.349051 4837 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.360568 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.360941 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.362694 4837 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8" exitCode=255
Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.362760 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8"}
Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.362809 4837 scope.go:117] "RemoveContainer" containerID="567274bd739faf34091c836a1d1ba1184d2ed741ed419f592fc7dbec60b92e8a"
Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.362996 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.363962 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.363985 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.363995 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.364585 4837 scope.go:117] "RemoveContainer" containerID="6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8"
Mar 13 11:49:11 crc kubenswrapper[4837]: E0313 11:49:11.364777 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 13 11:49:12 crc kubenswrapper[4837]: I0313 11:49:12.367730 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.134877 4837 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.345359 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.346803 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.346853 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.346866 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.347009 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.355772 4837 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.356106 4837 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.356132 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.360688 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.360727 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:15
crc kubenswrapper[4837]: I0313 11:49:15.360746 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.360768 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.360814 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:15Z","lastTransitionTime":"2026-03-13T11:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.376357 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.382626 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.382681 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.382703 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.382731 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.382744 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:15Z","lastTransitionTime":"2026-03-13T11:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.399792 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.404768 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.404808 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.404821 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.404839 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.404849 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:15Z","lastTransitionTime":"2026-03-13T11:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.414211 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.418489 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.418572 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.418586 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.418612 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.418627 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:15Z","lastTransitionTime":"2026-03-13T11:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.429669 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.429893 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.429944 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.530442 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.544668 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.544893 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.546224 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.546257 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.546266 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.547068 4837 scope.go:117] "RemoveContainer" containerID="6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8" Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.547242 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with 
CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.630954 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.731805 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.832419 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.933272 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:16 crc kubenswrapper[4837]: E0313 11:49:16.034284 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:16 crc kubenswrapper[4837]: E0313 11:49:16.134470 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:16 crc kubenswrapper[4837]: E0313 11:49:16.235657 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:16 crc kubenswrapper[4837]: E0313 11:49:16.335788 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:16 crc kubenswrapper[4837]: I0313 11:49:16.377590 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:49:16 crc kubenswrapper[4837]: I0313 11:49:16.380054 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:16 crc 
kubenswrapper[4837]: I0313 11:49:16.381264 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:16 crc kubenswrapper[4837]: I0313 11:49:16.381299 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:16 crc kubenswrapper[4837]: I0313 11:49:16.381312 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:16 crc kubenswrapper[4837]: I0313 11:49:16.382251 4837 scope.go:117] "RemoveContainer" containerID="6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8"
Mar 13 11:49:16 crc kubenswrapper[4837]: E0313 11:49:16.382572 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 13 11:49:16 crc kubenswrapper[4837]: E0313 11:49:16.435902 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:16 crc kubenswrapper[4837]: E0313 11:49:16.536547 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:16 crc kubenswrapper[4837]: E0313 11:49:16.636753 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:16 crc kubenswrapper[4837]: E0313 11:49:16.737794 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:16 crc kubenswrapper[4837]: E0313 11:49:16.838348 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:16 crc kubenswrapper[4837]: E0313 11:49:16.939314 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:17 crc kubenswrapper[4837]: E0313 11:49:17.040191 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:17 crc kubenswrapper[4837]: E0313 11:49:17.141184 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:17 crc kubenswrapper[4837]: E0313 11:49:17.242230 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:17 crc kubenswrapper[4837]: E0313 11:49:17.343344 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:17 crc kubenswrapper[4837]: E0313 11:49:17.443806 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:17 crc kubenswrapper[4837]: E0313 11:49:17.545090 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:17 crc kubenswrapper[4837]: E0313 11:49:17.646273 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:17 crc kubenswrapper[4837]: I0313 11:49:17.737499 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 11:49:17 crc kubenswrapper[4837]: I0313 11:49:17.737796 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 11:49:17 crc kubenswrapper[4837]: I0313 11:49:17.739448 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:17 crc kubenswrapper[4837]: I0313 11:49:17.739496 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:17 crc kubenswrapper[4837]: I0313 11:49:17.739506 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:17 crc kubenswrapper[4837]: E0313 11:49:17.746691 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:17 crc kubenswrapper[4837]: E0313 11:49:17.847833 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:17 crc kubenswrapper[4837]: E0313 11:49:17.948026 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:18 crc kubenswrapper[4837]: E0313 11:49:18.048953 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:18 crc kubenswrapper[4837]: E0313 11:49:18.149432 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:18 crc kubenswrapper[4837]: E0313 11:49:18.249832 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:18 crc kubenswrapper[4837]: E0313 11:49:18.350659 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:18 crc kubenswrapper[4837]: E0313 11:49:18.451827 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:18 crc kubenswrapper[4837]: E0313 11:49:18.552268 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:18 crc kubenswrapper[4837]: E0313 11:49:18.652490 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:18 crc kubenswrapper[4837]: E0313 11:49:18.753587 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:18 crc kubenswrapper[4837]: E0313 11:49:18.853902 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:18 crc kubenswrapper[4837]: E0313 11:49:18.954020 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:19 crc kubenswrapper[4837]: E0313 11:49:19.054358 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:19 crc kubenswrapper[4837]: E0313 11:49:19.155528 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:19 crc kubenswrapper[4837]: E0313 11:49:19.256136 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:19 crc kubenswrapper[4837]: E0313 11:49:19.357082 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:19 crc kubenswrapper[4837]: E0313 11:49:19.457307 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:19 crc kubenswrapper[4837]: E0313 11:49:19.557511 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:19 crc kubenswrapper[4837]: E0313 11:49:19.658059 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:19 crc kubenswrapper[4837]: E0313 11:49:19.758885 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:19 crc kubenswrapper[4837]: E0313 11:49:19.859673 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:19 crc kubenswrapper[4837]: E0313 11:49:19.959800 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:20 crc kubenswrapper[4837]: E0313 11:49:20.060564 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:20 crc kubenswrapper[4837]: E0313 11:49:20.160909 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:20 crc kubenswrapper[4837]: E0313 11:49:20.261263 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:20 crc kubenswrapper[4837]: E0313 11:49:20.362457 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:20 crc kubenswrapper[4837]: E0313 11:49:20.462923 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:20 crc kubenswrapper[4837]: E0313 11:49:20.563297 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:20 crc kubenswrapper[4837]: E0313 11:49:20.664305 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:20 crc kubenswrapper[4837]: E0313 11:49:20.764877 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:20 crc kubenswrapper[4837]: E0313 11:49:20.865062 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:20 crc kubenswrapper[4837]: E0313 11:49:20.965697 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:21 crc kubenswrapper[4837]: E0313 11:49:21.066711 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:21 crc kubenswrapper[4837]: E0313 11:49:21.167473 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:21 crc kubenswrapper[4837]: E0313 11:49:21.267832 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:21 crc kubenswrapper[4837]: E0313 11:49:21.368128 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:21 crc kubenswrapper[4837]: E0313 11:49:21.468725 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:21 crc kubenswrapper[4837]: E0313 11:49:21.569049 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:21 crc kubenswrapper[4837]: E0313 11:49:21.669412 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:21 crc kubenswrapper[4837]: E0313 11:49:21.770055 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:21 crc kubenswrapper[4837]: E0313 11:49:21.870571 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:21 crc kubenswrapper[4837]: E0313 11:49:21.971138 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:22 crc kubenswrapper[4837]: E0313 11:49:22.072405 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:22 crc kubenswrapper[4837]: E0313 11:49:22.172773 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:22 crc kubenswrapper[4837]: E0313 11:49:22.273259 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:22 crc kubenswrapper[4837]: E0313 11:49:22.374013 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:22 crc kubenswrapper[4837]: E0313 11:49:22.475061 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:22 crc kubenswrapper[4837]: E0313 11:49:22.590057 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:22 crc kubenswrapper[4837]: E0313 11:49:22.690410 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:22 crc kubenswrapper[4837]: E0313 11:49:22.791223 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:22 crc kubenswrapper[4837]: E0313 11:49:22.892120 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:22 crc kubenswrapper[4837]: E0313 11:49:22.992690 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:23 crc kubenswrapper[4837]: E0313 11:49:23.093539 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:23 crc kubenswrapper[4837]: E0313 11:49:23.194467 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:23 crc kubenswrapper[4837]: E0313 11:49:23.294568 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:23 crc kubenswrapper[4837]: E0313 11:49:23.395446 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:23 crc kubenswrapper[4837]: E0313 11:49:23.496285 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:23 crc kubenswrapper[4837]: E0313 11:49:23.597112 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:23 crc kubenswrapper[4837]: E0313 11:49:23.697455 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:23 crc kubenswrapper[4837]: E0313 11:49:23.797901 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:23 crc kubenswrapper[4837]: E0313 11:49:23.898770 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:24 crc kubenswrapper[4837]: E0313 11:49:23.999971 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:24 crc kubenswrapper[4837]: E0313 11:49:24.100902 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:24 crc kubenswrapper[4837]: I0313 11:49:24.161090 4837 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 13 11:49:24 crc kubenswrapper[4837]: E0313 11:49:24.201898 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:24 crc kubenswrapper[4837]: E0313 11:49:24.302484 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:24 crc kubenswrapper[4837]: E0313 11:49:24.403849 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:24 crc kubenswrapper[4837]: E0313 11:49:24.504363 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:24 crc kubenswrapper[4837]: E0313 11:49:24.605100 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:24 crc kubenswrapper[4837]: E0313 11:49:24.705393 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:24 crc kubenswrapper[4837]: E0313 11:49:24.806329 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:24 crc kubenswrapper[4837]: E0313 11:49:24.907327 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.007465 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.108044 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.135359 4837 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.208540 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.309322 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.410101 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.511206 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.561601 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.566768 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.566816 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.566833 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.566858 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.566879 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:25Z","lastTransitionTime":"2026-03-13T11:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.585482 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.590556 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.590594 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.590606 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.590627 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.590647 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:25Z","lastTransitionTime":"2026-03-13T11:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.605149 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.610464 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.610497 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.610509 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.610526 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.610538 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:25Z","lastTransitionTime":"2026-03-13T11:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.622576 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.627595 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.627653 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.627667 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.627687 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.627699 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:25Z","lastTransitionTime":"2026-03-13T11:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.639427 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.639597 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.639628 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.739821 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.840902 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.941138 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:26 crc kubenswrapper[4837]: E0313 11:49:26.042199 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:26 crc kubenswrapper[4837]: E0313 11:49:26.142966 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:26 crc kubenswrapper[4837]: E0313 11:49:26.243280 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:26 crc kubenswrapper[4837]: E0313 11:49:26.344522 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:26 crc kubenswrapper[4837]: E0313 11:49:26.445387 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:26 crc kubenswrapper[4837]: E0313 11:49:26.546560 4837 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:26 crc kubenswrapper[4837]: E0313 11:49:26.647528 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:26 crc kubenswrapper[4837]: E0313 11:49:26.748373 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:26 crc kubenswrapper[4837]: E0313 11:49:26.849106 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:26 crc kubenswrapper[4837]: E0313 11:49:26.950572 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:27 crc kubenswrapper[4837]: I0313 11:49:27.047896 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:27 crc kubenswrapper[4837]: I0313 11:49:27.049141 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:27 crc kubenswrapper[4837]: I0313 11:49:27.049213 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:27 crc kubenswrapper[4837]: I0313 11:49:27.049231 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:27 crc kubenswrapper[4837]: I0313 11:49:27.050439 4837 scope.go:117] "RemoveContainer" containerID="6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8" Mar 13 11:49:27 crc kubenswrapper[4837]: E0313 11:49:27.050792 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:27 crc kubenswrapper[4837]: E0313 11:49:27.050792 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:49:27 crc kubenswrapper[4837]: E0313 11:49:27.151998 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:27 crc kubenswrapper[4837]: E0313 11:49:27.252267 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:27 crc kubenswrapper[4837]: E0313 11:49:27.352498 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:27 crc kubenswrapper[4837]: E0313 11:49:27.453734 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:27 crc kubenswrapper[4837]: E0313 11:49:27.558600 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:27 crc kubenswrapper[4837]: E0313 11:49:27.658733 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:27 crc kubenswrapper[4837]: E0313 11:49:27.759188 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:27 crc kubenswrapper[4837]: E0313 11:49:27.859958 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:27 crc kubenswrapper[4837]: E0313 11:49:27.960172 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:28 crc kubenswrapper[4837]: E0313 11:49:28.060463 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" 
not found" Mar 13 11:49:28 crc kubenswrapper[4837]: E0313 11:49:28.161227 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:28 crc kubenswrapper[4837]: E0313 11:49:28.261427 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:28 crc kubenswrapper[4837]: E0313 11:49:28.361978 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:28 crc kubenswrapper[4837]: E0313 11:49:28.462574 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:28 crc kubenswrapper[4837]: E0313 11:49:28.563708 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:28 crc kubenswrapper[4837]: E0313 11:49:28.663976 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:28 crc kubenswrapper[4837]: E0313 11:49:28.764787 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:28 crc kubenswrapper[4837]: E0313 11:49:28.864928 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:28 crc kubenswrapper[4837]: E0313 11:49:28.965447 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:29 crc kubenswrapper[4837]: E0313 11:49:29.065720 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:29 crc kubenswrapper[4837]: E0313 11:49:29.166934 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:29 crc kubenswrapper[4837]: E0313 11:49:29.267451 4837 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"crc\" not found" Mar 13 11:49:29 crc kubenswrapper[4837]: E0313 11:49:29.368154 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:29 crc kubenswrapper[4837]: E0313 11:49:29.468693 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:29 crc kubenswrapper[4837]: E0313 11:49:29.569115 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:29 crc kubenswrapper[4837]: E0313 11:49:29.669544 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:29 crc kubenswrapper[4837]: E0313 11:49:29.770609 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:29 crc kubenswrapper[4837]: E0313 11:49:29.871367 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:29 crc kubenswrapper[4837]: E0313 11:49:29.972008 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:30 crc kubenswrapper[4837]: E0313 11:49:30.072327 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:30 crc kubenswrapper[4837]: E0313 11:49:30.172454 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:30 crc kubenswrapper[4837]: E0313 11:49:30.273025 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:30 crc kubenswrapper[4837]: E0313 11:49:30.373252 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:30 crc kubenswrapper[4837]: E0313 11:49:30.473799 4837 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:30 crc kubenswrapper[4837]: E0313 11:49:30.574398 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:30 crc kubenswrapper[4837]: E0313 11:49:30.675145 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:30 crc kubenswrapper[4837]: E0313 11:49:30.776162 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:30 crc kubenswrapper[4837]: E0313 11:49:30.876712 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:30 crc kubenswrapper[4837]: E0313 11:49:30.977099 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:31 crc kubenswrapper[4837]: E0313 11:49:31.077491 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:31 crc kubenswrapper[4837]: E0313 11:49:31.178578 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:31 crc kubenswrapper[4837]: E0313 11:49:31.279239 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:31 crc kubenswrapper[4837]: E0313 11:49:31.379729 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:31 crc kubenswrapper[4837]: E0313 11:49:31.480347 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:31 crc kubenswrapper[4837]: E0313 11:49:31.581541 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:31 crc 
kubenswrapper[4837]: E0313 11:49:31.682371 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:31 crc kubenswrapper[4837]: E0313 11:49:31.782558 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:31 crc kubenswrapper[4837]: E0313 11:49:31.883702 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:31 crc kubenswrapper[4837]: E0313 11:49:31.984541 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:32 crc kubenswrapper[4837]: E0313 11:49:32.084942 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:32 crc kubenswrapper[4837]: E0313 11:49:32.186415 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:32 crc kubenswrapper[4837]: E0313 11:49:32.286747 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:32 crc kubenswrapper[4837]: E0313 11:49:32.387325 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:32 crc kubenswrapper[4837]: E0313 11:49:32.488301 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:32 crc kubenswrapper[4837]: E0313 11:49:32.589523 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:32 crc kubenswrapper[4837]: E0313 11:49:32.690290 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:32 crc kubenswrapper[4837]: E0313 11:49:32.791204 4837 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 13 11:49:32 crc kubenswrapper[4837]: E0313 11:49:32.891772 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:32 crc kubenswrapper[4837]: E0313 11:49:32.992655 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:33 crc kubenswrapper[4837]: E0313 11:49:33.093242 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:33 crc kubenswrapper[4837]: E0313 11:49:33.194197 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:33 crc kubenswrapper[4837]: E0313 11:49:33.294760 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:33 crc kubenswrapper[4837]: E0313 11:49:33.395274 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:33 crc kubenswrapper[4837]: E0313 11:49:33.495901 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:33 crc kubenswrapper[4837]: E0313 11:49:33.596560 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:33 crc kubenswrapper[4837]: E0313 11:49:33.697457 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:33 crc kubenswrapper[4837]: E0313 11:49:33.797849 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:33 crc kubenswrapper[4837]: E0313 11:49:33.898996 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:33 crc kubenswrapper[4837]: E0313 11:49:33.999151 4837 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:34 crc kubenswrapper[4837]: E0313 11:49:34.099728 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:34 crc kubenswrapper[4837]: E0313 11:49:34.200866 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:34 crc kubenswrapper[4837]: E0313 11:49:34.302151 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:34 crc kubenswrapper[4837]: E0313 11:49:34.403504 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:34 crc kubenswrapper[4837]: E0313 11:49:34.504575 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:34 crc kubenswrapper[4837]: E0313 11:49:34.605178 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:34 crc kubenswrapper[4837]: E0313 11:49:34.705511 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:34 crc kubenswrapper[4837]: E0313 11:49:34.806665 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:34 crc kubenswrapper[4837]: E0313 11:49:34.907189 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.008055 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.013048 4837 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.088424 
4837 apiserver.go:52] "Watching apiserver" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.095472 4837 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.096288 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-cjn4q","openshift-ovn-kubernetes/ovnkube-node-4zzrs","openshift-image-registry/node-ca-np68d","openshift-machine-config-operator/machine-config-daemon-2td4d","openshift-dns/node-resolver-xwmn9","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl","openshift-multus/multus-qg957","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-multus/multus-additional-cni-plugins-xkqn6","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.096802 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.097029 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.097128 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.097225 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.097223 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.097437 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.098018 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.098330 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.098389 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.098653 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.098682 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-np68d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.098902 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xwmn9" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.099053 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.099498 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.099699 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.099810 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.099504 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.100136 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.100504 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.105395 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.105636 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.105738 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.106061 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.105849 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.106221 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.106350 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.106392 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.106565 4837 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.106577 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.106593 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.106620 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.107011 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.107125 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.107078 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.107420 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.107845 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.108027 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.108147 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.108308 4837 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.108420 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.108543 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.108767 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.108936 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.108419 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.109258 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.108106 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.109537 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.109351 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.109818 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.109957 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.110158 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.110287 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.110470 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.110614 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.108031 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.116911 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.116977 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.116991 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.117015 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.117036 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:35Z","lastTransitionTime":"2026-03-13T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.130306 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.143641 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.156422 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.168818 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.179491 4837 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.181011 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.189663 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.200539 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205063 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205133 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205171 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205223 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205253 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205301 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205330 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205379 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205407 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205433 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205432 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205481 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205506 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205553 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205578 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205627 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205560 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205810 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206376 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206450 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206564 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206455 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206434 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206682 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206530 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206729 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206792 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206803 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206807 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206820 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206897 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206951 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207042 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207095 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207122 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207144 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207158 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207185 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207206 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207227 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207390 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207413 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207452 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207476 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207498 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207594 4837
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207602 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207623 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207682 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207710 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207813 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207847 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207896 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207921 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207966 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207989 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 13 11:49:35 crc kubenswrapper[4837]: 
I0313 11:49:35.208014 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208131 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208187 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208216 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208263 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208344 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208381 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208427 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208456 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208475 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208514 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 
11:49:35.208536 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208557 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208591 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208614 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208657 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208682 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208702 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208740 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208794 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208952 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208973 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209015 4837 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209037 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209058 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209092 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209111 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209129 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209214 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209239 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209258 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209294 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209314 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209338 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209381 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209404 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209424 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209460 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209480 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: 
\"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209497 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209531 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209550 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209569 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209602 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209748 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209771 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209789 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209833 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209851 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209873 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209910 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209931 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209953 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211012 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211037 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211075 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211096 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211114 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211161 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211189 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211237 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211260 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211277 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207857 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211313 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207908 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211334 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208018 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208191 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211355 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211393 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211414 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211436 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211481 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211509 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211555 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211575 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211597 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211628 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211671 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 
11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211716 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211742 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211768 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.213408 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208187 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208219 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208598 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208828 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208877 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208814 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209202 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209221 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209218 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209571 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209687 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209738 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209973 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.210334 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.210503 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.210796 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.210817 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.210880 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.210897 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.210959 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211678 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211707 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.211817 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:49:35.711795341 +0000 UTC m=+91.350062094 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.214900 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.214942 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.214966 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.214991 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215016 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215038 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215061 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215079 4837 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215097 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215116 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215133 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215144 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215151 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215183 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215421 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215446 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215849 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215880 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.216074 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.216179 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.216841 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.217217 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.217242 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod 
\"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.217367 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.217260 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.217470 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.217955 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.218180 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.218367 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.218402 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.218577 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.218628 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.218716 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.218745 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.219159 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.219535 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.219560 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.219582 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.219612 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.219656 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.212114 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.212491 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.212730 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.212623 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.219815 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.219845 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.220074 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 
13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.220102 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.220754 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.221296 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.221543 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.221584 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.221966 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222079 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222108 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222134 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222223 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222388 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 
11:49:35.222395 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222424 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222461 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222483 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222497 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:35Z","lastTransitionTime":"2026-03-13T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222422 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222903 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222929 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222953 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222977 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223002 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223029 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223059 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223088 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223116 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223139 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 13 11:49:35 crc 
kubenswrapper[4837]: I0313 11:49:35.223165 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223186 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223211 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223236 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223445 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223477 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223502 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.212992 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.213024 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.213111 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.213423 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.213625 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.213653 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.213699 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223760 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.213923 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.213971 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.213999 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.214064 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.214084 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.214199 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223378 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223676 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.224296 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.224319 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.224326 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.224377 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.224469 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.224495 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.224514 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.224685 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.225121 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.225185 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.225253 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.225322 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.225258 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.225479 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.225537 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.225595 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.226214 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.225685 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.225901 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.225949 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.226282 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.225846 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.225957 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.227548 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-run-multus-certs\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.227579 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-kubelet\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.227600 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdh8r\" (UniqueName: \"kubernetes.io/projected/4c126c88-4541-474c-bc1f-5ca9befa3146-kube-api-access-wdh8r\") pod \"node-ca-np68d\" (UID: \"4c126c88-4541-474c-bc1f-5ca9befa3146\") " pod="openshift-image-registry/node-ca-np68d"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.227662 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.227713 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-cni-binary-copy\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.227737 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-run-k8s-cni-cncf-io\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.227758 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-cnibin\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.227775 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-var-lib-kubelet\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.227804 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.227833 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/501b48f2-bba8-44d4-81df-7a8b7df456b5-system-cni-dir\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.227884 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/501b48f2-bba8-44d4-81df-7a8b7df456b5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.227936 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/338e0d25-c97d-42ec-a8ec-51ddf77a5ed8-rootfs\") pod \"machine-config-daemon-2td4d\" (UID: \"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\") " pod="openshift-machine-config-operator/machine-config-daemon-2td4d"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.227978 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-multus-cni-dir\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.228003 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.228213 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-ovnkube-script-lib\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.228473 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.228506 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.228540 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.228717 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.228749 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.228416 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-hostroot\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.229721 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-run-ovn-kubernetes\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.229804 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-cni-bin\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.229463 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.229505 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.229702 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.229733 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.229867 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.230157 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.230187 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.230238 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.230536 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.230572 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.230609 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85hll\" (UniqueName: \"kubernetes.io/projected/43df29f7-1351-41f5-bfca-17f804837cb4-kube-api-access-85hll\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.230706 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.230896 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/501b48f2-bba8-44d4-81df-7a8b7df456b5-cnibin\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231001 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvtx6\" (UniqueName: \"kubernetes.io/projected/338e0d25-c97d-42ec-a8ec-51ddf77a5ed8-kube-api-access-cvtx6\") pod \"machine-config-daemon-2td4d\" (UID: \"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\") " pod="openshift-machine-config-operator/machine-config-daemon-2td4d"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231020 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-systemd-units\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231036 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-log-socket\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231056 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmhlq\" (UniqueName: \"kubernetes.io/projected/501b48f2-bba8-44d4-81df-7a8b7df456b5-kube-api-access-pmhlq\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231087 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231103 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-env-overrides\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231208 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4c126c88-4541-474c-bc1f-5ca9befa3146-serviceca\") pod \"node-ca-np68d\" (UID: \"4c126c88-4541-474c-bc1f-5ca9befa3146\") " pod="openshift-image-registry/node-ca-np68d"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231232 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231251 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/338e0d25-c97d-42ec-a8ec-51ddf77a5ed8-proxy-tls\") pod \"machine-config-daemon-2td4d\" (UID: \"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\") " pod="openshift-machine-config-operator/machine-config-daemon-2td4d"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231268 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e05c56f7-b007-4165-9e29-98cfa865d020-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-dt7fl\" (UID: \"e05c56f7-b007-4165-9e29-98cfa865d020\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231284 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e05c56f7-b007-4165-9e29-98cfa865d020-env-overrides\") pod \"ovnkube-control-plane-749d76644c-dt7fl\" (UID: \"e05c56f7-b007-4165-9e29-98cfa865d020\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231300 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-system-cni-dir\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231316 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-os-release\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231337 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231353 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-run-netns\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231375 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231395 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231415 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c126c88-4541-474c-bc1f-5ca9befa3146-host\") pod \"node-ca-np68d\" (UID: \"4c126c88-4541-474c-bc1f-5ca9befa3146\") " pod="openshift-image-registry/node-ca-np68d"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231427 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231435 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-ovn\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231517 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231534 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231559 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231578 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nj56\" (UniqueName: \"kubernetes.io/projected/86e5afeb-4720-4593-a53e-dfb5381d0b1d-kube-api-access-6nj56\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231596 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/338e0d25-c97d-42ec-a8ec-51ddf77a5ed8-mcd-auth-proxy-config\") pod \"machine-config-daemon-2td4d\" (UID: \"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\") " pod="openshift-machine-config-operator/machine-config-daemon-2td4d"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231612 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-multus-conf-dir\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231629 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-multus-daemon-config\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231671 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231690 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-run-netns\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231709 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231703 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231729 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f6398583-f9ff-4b10-829a-503fd523710b-hosts-file\") pod \"node-resolver-xwmn9\" (UID: \"f6398583-f9ff-4b10-829a-503fd523710b\") " pod="openshift-dns/node-resolver-xwmn9"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231903 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-cni-netd\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231951 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/501b48f2-bba8-44d4-81df-7a8b7df456b5-os-release\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231988 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9f5g\" (UniqueName: \"kubernetes.io/projected/e05c56f7-b007-4165-9e29-98cfa865d020-kube-api-access-r9f5g\") pod \"ovnkube-control-plane-749d76644c-dt7fl\" (UID: \"e05c56f7-b007-4165-9e29-98cfa865d020\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232018 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fqxj\" (UniqueName: \"kubernetes.io/projected/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-kube-api-access-2fqxj\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232098 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/501b48f2-bba8-44d4-81df-7a8b7df456b5-cni-binary-copy\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232130 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-etc-kubernetes\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232141 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232162 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43df29f7-1351-41f5-bfca-17f804837cb4-ovn-node-metrics-cert\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232195 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-var-lib-openvswitch\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232223 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-openvswitch\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232254 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/501b48f2-bba8-44d4-81df-7a8b7df456b5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232269 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232282 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e05c56f7-b007-4165-9e29-98cfa865d020-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-dt7fl\" (UID: \"e05c56f7-b007-4165-9e29-98cfa865d020\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232321 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-var-lib-cni-bin\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232354 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-var-lib-cni-multus\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232363 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232387 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232415 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7ckv\" (UniqueName: \"kubernetes.io/projected/f6398583-f9ff-4b10-829a-503fd523710b-kube-api-access-q7ckv\") pod \"node-resolver-xwmn9\" (UID: \"f6398583-f9ff-4b10-829a-503fd523710b\") " pod="openshift-dns/node-resolver-xwmn9"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232450 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-slash\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232471 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-etc-openvswitch\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232489 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232521 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232558 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232586 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-node-log\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232609 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-multus-socket-dir-parent\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232631 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\"
(UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-systemd\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232662 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-ovnkube-config\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.233147 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232486 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232598 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231158 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232669 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232718 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232739 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232759 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.233096 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.233307 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:35.733278808 +0000 UTC m=+91.371545751 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.233440 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.233558 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.233754 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:35.733742942 +0000 UTC m=+91.372009925 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.233957 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.234226 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.234253 4837 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.234385 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.234418 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.234439 4837 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.234458 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.234801 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.235589 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.235627 4837 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.235675 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.235696 4837 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.235712 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.235717 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.235727 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.235883 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236709 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236737 4837 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236756 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236772 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236788 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236803 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236816 4837 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236832 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236851 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236867 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236881 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236895 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236909 4837 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236924 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236957 4837 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236971 4837 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236985 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237000 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237014 4837 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" 
DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237027 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237040 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237054 4837 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237068 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237083 4837 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237098 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237111 4837 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc 
kubenswrapper[4837]: I0313 11:49:35.237124 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237137 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237151 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237164 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237177 4837 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237190 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237203 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237216 4837 reconciler_common.go:293] "Volume detached 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237228 4837 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237242 4837 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237255 4837 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237267 4837 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237280 4837 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237295 4837 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237308 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" 
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237322 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237336 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237351 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237365 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237382 4837 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237396 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237409 4837 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237422 4837 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237436 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237449 4837 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237465 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237478 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237491 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237505 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237119 4837 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237518 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237575 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237589 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237603 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237618 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237637 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237665 4837 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on 
node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237678 4837 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237691 4837 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238405 4837 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238421 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238435 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238450 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238464 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: 
I0313 11:49:35.238479 4837 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238493 4837 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238506 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238519 4837 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238532 4837 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238545 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238558 4837 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238574 4837 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238592 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238605 4837 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238618 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238630 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238683 4837 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238695 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238708 4837 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node 
\"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238720 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238733 4837 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238746 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238760 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238774 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.245603 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.246145 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.248064 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.248420 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.248670 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.248903 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236040 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.249314 4837 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.249486 4837 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.249316 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.249589 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.249729 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). 
InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.249919 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.250031 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.250208 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.250126 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.250234 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.250611 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.250655 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.250670 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.250752 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:35.750727908 +0000 UTC m=+91.388994671 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.251283 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.251321 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.251341 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.251425 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:35.751400419 +0000 UTC m=+91.389667182 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.252775 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.256468 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.259156 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.262534 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.262765 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.263355 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.263464 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.263792 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.263866 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.264093 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.264509 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.265407 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.265434 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.265474 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.265661 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.266177 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.266354 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.267756 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.267789 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.268188 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.268468 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.268519 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.268969 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.269155 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.269270 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.269362 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.269438 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.269526 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.269721 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.270224 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.273814 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.273889 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.274265 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.274380 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.274337 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.274171 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.274395 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.274837 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.274977 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.279189 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.279189 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.279241 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.279407 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.279792 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.280452 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.280472 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.280767 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.280768 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.280856 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.280873 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.281228 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.281249 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.281259 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.281308 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.281543 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.281617 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.282673 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.283969 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.284009 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.284381 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.284427 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.284483 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.285273 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.288014 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.288247 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.288837 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.289402 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.289452 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.289429 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.289737 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.291919 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.296908 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.299490 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.304017 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.306236 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.309733 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.311982 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.320518 4837 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.325254 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.325345 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.325411 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.325538 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.325602 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:35Z","lastTransitionTime":"2026-03-13T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.327114 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350469 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350505 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/338e0d25-c97d-42ec-a8ec-51ddf77a5ed8-proxy-tls\") pod \"machine-config-daemon-2td4d\" (UID: 
\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\") " pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350522 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e05c56f7-b007-4165-9e29-98cfa865d020-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-dt7fl\" (UID: \"e05c56f7-b007-4165-9e29-98cfa865d020\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350555 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e05c56f7-b007-4165-9e29-98cfa865d020-env-overrides\") pod \"ovnkube-control-plane-749d76644c-dt7fl\" (UID: \"e05c56f7-b007-4165-9e29-98cfa865d020\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350571 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-env-overrides\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350588 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4c126c88-4541-474c-bc1f-5ca9befa3146-serviceca\") pod \"node-ca-np68d\" (UID: \"4c126c88-4541-474c-bc1f-5ca9befa3146\") " pod="openshift-image-registry/node-ca-np68d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350606 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-system-cni-dir\") pod \"multus-qg957\" (UID: 
\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350620 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-os-release\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350657 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-run-netns\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350685 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c126c88-4541-474c-bc1f-5ca9befa3146-host\") pod \"node-ca-np68d\" (UID: \"4c126c88-4541-474c-bc1f-5ca9befa3146\") " pod="openshift-image-registry/node-ca-np68d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350701 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nj56\" (UniqueName: \"kubernetes.io/projected/86e5afeb-4720-4593-a53e-dfb5381d0b1d-kube-api-access-6nj56\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350717 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/338e0d25-c97d-42ec-a8ec-51ddf77a5ed8-mcd-auth-proxy-config\") pod \"machine-config-daemon-2td4d\" (UID: \"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\") " pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 
11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350731 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-multus-conf-dir\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350745 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-ovn\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350763 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-multus-daemon-config\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350779 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-run-netns\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350804 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-cni-netd\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350833 4837 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/501b48f2-bba8-44d4-81df-7a8b7df456b5-os-release\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350855 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9f5g\" (UniqueName: \"kubernetes.io/projected/e05c56f7-b007-4165-9e29-98cfa865d020-kube-api-access-r9f5g\") pod \"ovnkube-control-plane-749d76644c-dt7fl\" (UID: \"e05c56f7-b007-4165-9e29-98cfa865d020\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350879 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fqxj\" (UniqueName: \"kubernetes.io/projected/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-kube-api-access-2fqxj\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350898 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f6398583-f9ff-4b10-829a-503fd523710b-hosts-file\") pod \"node-resolver-xwmn9\" (UID: \"f6398583-f9ff-4b10-829a-503fd523710b\") " pod="openshift-dns/node-resolver-xwmn9" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350918 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/501b48f2-bba8-44d4-81df-7a8b7df456b5-cni-binary-copy\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350932 4837 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-os-release\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350966 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-etc-kubernetes\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350939 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-etc-kubernetes\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351000 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-run-netns\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351005 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43df29f7-1351-41f5-bfca-17f804837cb4-ovn-node-metrics-cert\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351030 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c126c88-4541-474c-bc1f-5ca9befa3146-host\") pod \"node-ca-np68d\" (UID: \"4c126c88-4541-474c-bc1f-5ca9befa3146\") " 
pod="openshift-image-registry/node-ca-np68d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351034 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/501b48f2-bba8-44d4-81df-7a8b7df456b5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351060 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e05c56f7-b007-4165-9e29-98cfa865d020-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-dt7fl\" (UID: \"e05c56f7-b007-4165-9e29-98cfa865d020\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351085 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-var-lib-cni-bin\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351109 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-var-lib-cni-multus\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351132 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-var-lib-openvswitch\") pod \"ovnkube-node-4zzrs\" (UID: 
\"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351155 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-openvswitch\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351176 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7ckv\" (UniqueName: \"kubernetes.io/projected/f6398583-f9ff-4b10-829a-503fd523710b-kube-api-access-q7ckv\") pod \"node-resolver-xwmn9\" (UID: \"f6398583-f9ff-4b10-829a-503fd523710b\") " pod="openshift-dns/node-resolver-xwmn9" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351201 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-slash\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351224 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-etc-openvswitch\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351246 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351285 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-node-log\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351307 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-multus-socket-dir-parent\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351341 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-systemd\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351366 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-ovnkube-config\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351387 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdh8r\" (UniqueName: \"kubernetes.io/projected/4c126c88-4541-474c-bc1f-5ca9befa3146-kube-api-access-wdh8r\") pod \"node-ca-np68d\" (UID: \"4c126c88-4541-474c-bc1f-5ca9befa3146\") " pod="openshift-image-registry/node-ca-np68d" Mar 13 11:49:35 crc 
kubenswrapper[4837]: I0313 11:49:35.351409 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-cni-binary-copy\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351430 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-run-k8s-cni-cncf-io\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351453 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-run-multus-certs\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351474 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-kubelet\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351497 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/501b48f2-bba8-44d4-81df-7a8b7df456b5-system-cni-dir\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351520 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/501b48f2-bba8-44d4-81df-7a8b7df456b5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351543 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/338e0d25-c97d-42ec-a8ec-51ddf77a5ed8-rootfs\") pod \"machine-config-daemon-2td4d\" (UID: \"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\") " pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351564 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-multus-cni-dir\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351587 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-cnibin\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351609 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-var-lib-kubelet\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351631 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351674 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-ovnkube-script-lib\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351697 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-hostroot\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351721 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-run-ovn-kubernetes\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351737 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e05c56f7-b007-4165-9e29-98cfa865d020-env-overrides\") pod \"ovnkube-control-plane-749d76644c-dt7fl\" (UID: \"e05c56f7-b007-4165-9e29-98cfa865d020\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351760 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-cni-bin\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351798 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/501b48f2-bba8-44d4-81df-7a8b7df456b5-cnibin\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351824 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvtx6\" (UniqueName: \"kubernetes.io/projected/338e0d25-c97d-42ec-a8ec-51ddf77a5ed8-kube-api-access-cvtx6\") pod \"machine-config-daemon-2td4d\" (UID: \"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\") " pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351846 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-systemd-units\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.351855 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351869 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-log-socket\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 
11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351893 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85hll\" (UniqueName: \"kubernetes.io/projected/43df29f7-1351-41f5-bfca-17f804837cb4-kube-api-access-85hll\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.351926 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs podName:86e5afeb-4720-4593-a53e-dfb5381d0b1d nodeName:}" failed. No retries permitted until 2026-03-13 11:49:35.851903114 +0000 UTC m=+91.490169897 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs") pod "network-metrics-daemon-cjn4q" (UID: "86e5afeb-4720-4593-a53e-dfb5381d0b1d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351956 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352002 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmhlq\" (UniqueName: \"kubernetes.io/projected/501b48f2-bba8-44d4-81df-7a8b7df456b5-kube-api-access-pmhlq\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352085 4837 reconciler_common.go:293] "Volume detached for volume 
\"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352108 4837 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352110 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-systemd\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352127 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352146 4837 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352166 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352183 4837 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352199 4837 
reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352203 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/338e0d25-c97d-42ec-a8ec-51ddf77a5ed8-mcd-auth-proxy-config\") pod \"machine-config-daemon-2td4d\" (UID: \"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\") " pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352217 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352232 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352244 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-multus-conf-dir\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352248 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352272 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" 
DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352285 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352298 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352310 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352323 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352336 4837 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352349 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352361 4837 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352374 
4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352387 4837 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352398 4837 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352413 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352426 4837 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352438 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352451 4837 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352466 4837 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352483 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352496 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352510 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352521 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352533 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352523 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-kubelet\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352545 4837 reconciler_common.go:293] "Volume detached for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352564 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/501b48f2-bba8-44d4-81df-7a8b7df456b5-os-release\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352598 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352394 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-cnibin\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352660 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-env-overrides\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351903 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-multus-socket-dir-parent\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352715 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352884 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-var-lib-cni-bin\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352895 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352918 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352928 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/338e0d25-c97d-42ec-a8ec-51ddf77a5ed8-rootfs\") pod \"machine-config-daemon-2td4d\" (UID: \"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\") " pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352959 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-etc-openvswitch\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352981 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/501b48f2-bba8-44d4-81df-7a8b7df456b5-system-cni-dir\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353002 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-var-lib-cni-multus\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353089 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-systemd-units\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353134 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-log-socket\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353171 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-multus-cni-dir\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353184 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-hostroot\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353228 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-var-lib-kubelet\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353234 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-node-log\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353256 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-ovnkube-config\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353268 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-cni-netd\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353293 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-var-lib-openvswitch\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353276 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-cni-bin\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353509 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-cni-binary-copy\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353535 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-slash\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353556 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-openvswitch\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353595 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-system-cni-dir\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353940 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-run-netns\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353992 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/501b48f2-bba8-44d4-81df-7a8b7df456b5-cnibin\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354020 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-run-multus-certs\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354060 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f6398583-f9ff-4b10-829a-503fd523710b-hosts-file\") pod \"node-resolver-xwmn9\" (UID: \"f6398583-f9ff-4b10-829a-503fd523710b\") " pod="openshift-dns/node-resolver-xwmn9"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354077 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-ovn\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354099 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-run-k8s-cni-cncf-io\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354433 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e05c56f7-b007-4165-9e29-98cfa865d020-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-dt7fl\" (UID: \"e05c56f7-b007-4165-9e29-98cfa865d020\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354477 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-multus-daemon-config\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354538 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354561 4837 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354579 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354594 4837 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354610 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354627 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354668 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354682 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354697 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354714 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354730 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354745 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354761 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354774 4837 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354778 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-ovnkube-script-lib\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354789 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354910 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-run-ovn-kubernetes\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354946 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4c126c88-4541-474c-bc1f-5ca9befa3146-serviceca\") pod \"node-ca-np68d\" (UID: \"4c126c88-4541-474c-bc1f-5ca9befa3146\") " pod="openshift-image-registry/node-ca-np68d"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354991 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/501b48f2-bba8-44d4-81df-7a8b7df456b5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355094 4837 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355115 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355128 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355144 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355161 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355176 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355190 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355203 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355216 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355230 4837 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355243 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355258 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355272 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355286 4837 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355299 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355312 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355332 4837 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355347 4837 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355364 4837 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355395 4837 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355415 4837 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355432 4837 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355448 4837 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355465 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355482 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355500 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355518 4837 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355537 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355556 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355572 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355591 4837 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355609 4837 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355626 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355674 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355690 4837 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355706 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355724 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355740 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355758 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355777 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355801 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355822 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355841 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355860 4837 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.357729 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/338e0d25-c97d-42ec-a8ec-51ddf77a5ed8-proxy-tls\") pod \"machine-config-daemon-2td4d\" (UID: \"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\") " pod="openshift-machine-config-operator/machine-config-daemon-2td4d"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.358481 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/501b48f2-bba8-44d4-81df-7a8b7df456b5-cni-binary-copy\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.366787 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43df29f7-1351-41f5-bfca-17f804837cb4-ovn-node-metrics-cert\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.367135 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nj56\" (UniqueName: \"kubernetes.io/projected/86e5afeb-4720-4593-a53e-dfb5381d0b1d-kube-api-access-6nj56\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.368294 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/501b48f2-bba8-44d4-81df-7a8b7df456b5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.368659 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmhlq\" (UniqueName: \"kubernetes.io/projected/501b48f2-bba8-44d4-81df-7a8b7df456b5-kube-api-access-pmhlq\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.369207 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85hll\" (UniqueName: \"kubernetes.io/projected/43df29f7-1351-41f5-bfca-17f804837cb4-kube-api-access-85hll\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.370295 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdh8r\" (UniqueName: \"kubernetes.io/projected/4c126c88-4541-474c-bc1f-5ca9befa3146-kube-api-access-wdh8r\") pod \"node-ca-np68d\" (UID: \"4c126c88-4541-474c-bc1f-5ca9befa3146\") " pod="openshift-image-registry/node-ca-np68d"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.371697 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9f5g\" (UniqueName: \"kubernetes.io/projected/e05c56f7-b007-4165-9e29-98cfa865d020-kube-api-access-r9f5g\") pod \"ovnkube-control-plane-749d76644c-dt7fl\" (UID: \"e05c56f7-b007-4165-9e29-98cfa865d020\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.372805 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e05c56f7-b007-4165-9e29-98cfa865d020-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-dt7fl\" (UID: \"e05c56f7-b007-4165-9e29-98cfa865d020\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.379178 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvtx6\" (UniqueName: \"kubernetes.io/projected/338e0d25-c97d-42ec-a8ec-51ddf77a5ed8-kube-api-access-cvtx6\") pod \"machine-config-daemon-2td4d\" (UID: \"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\") " pod="openshift-machine-config-operator/machine-config-daemon-2td4d"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.381204 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7ckv\" (UniqueName: \"kubernetes.io/projected/f6398583-f9ff-4b10-829a-503fd523710b-kube-api-access-q7ckv\") pod \"node-resolver-xwmn9\" (UID: \"f6398583-f9ff-4b10-829a-503fd523710b\") " pod="openshift-dns/node-resolver-xwmn9"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.381512 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fqxj\" (UniqueName: \"kubernetes.io/projected/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-kube-api-access-2fqxj\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.427208 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.427606 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.427665 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.427681 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.427703 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.427715 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:35Z","lastTransitionTime":"2026-03-13T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.436236 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 13 11:49:35 crc kubenswrapper[4837]: W0313 11:49:35.447856 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-cb20f8c50a429be8b1d5add1f63d65a117337629d00c45bf4b2b6cd2d0def957 WatchSource:0}: Error finding container cb20f8c50a429be8b1d5add1f63d65a117337629d00c45bf4b2b6cd2d0def957: Status 404 returned error can't find the container with id cb20f8c50a429be8b1d5add1f63d65a117337629d00c45bf4b2b6cd2d0def957
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.456074 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.468510 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-np68d"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.473849 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xwmn9"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.482898 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2td4d"
Mar 13 11:49:35 crc kubenswrapper[4837]: W0313 11:49:35.484929 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c126c88_4541_474c_bc1f_5ca9befa3146.slice/crio-2881af3742e0d50b334f23b774fbebbacf8b6806c85f4d40f633913a88a6d442 WatchSource:0}: Error finding container 2881af3742e0d50b334f23b774fbebbacf8b6806c85f4d40f633913a88a6d442: Status 404 returned error can't find the container with id 2881af3742e0d50b334f23b774fbebbacf8b6806c85f4d40f633913a88a6d442
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.489936 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.498427 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qg957"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.505092 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xkqn6"
Mar 13 11:49:35 crc kubenswrapper[4837]: W0313 11:49:35.510916 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6398583_f9ff_4b10_829a_503fd523710b.slice/crio-166ac2e7f6f6b6b3592e07d9264bf9325d076196c25c41eff24468e141f1843d WatchSource:0}: Error finding container 166ac2e7f6f6b6b3592e07d9264bf9325d076196c25c41eff24468e141f1843d: Status 404 returned error can't find the container with id 166ac2e7f6f6b6b3592e07d9264bf9325d076196c25c41eff24468e141f1843d
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.515432 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.532514 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.532562 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.532571 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.532588 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.532597 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:35Z","lastTransitionTime":"2026-03-13T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:35 crc kubenswrapper[4837]: W0313 11:49:35.569308 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43df29f7_1351_41f5_bfca_17f804837cb4.slice/crio-17148b76b47a8d352ae2adca8c21dbaa4b189a84d57c2f7678c2d83f59bfc901 WatchSource:0}: Error finding container 17148b76b47a8d352ae2adca8c21dbaa4b189a84d57c2f7678c2d83f59bfc901: Status 404 returned error can't find the container with id 17148b76b47a8d352ae2adca8c21dbaa4b189a84d57c2f7678c2d83f59bfc901
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.635176 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.635218 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.635228 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.635244 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.635254 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:35Z","lastTransitionTime":"2026-03-13T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.739983 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.740048 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.740057 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.740075 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.740086 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:35Z","lastTransitionTime":"2026-03-13T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.759439 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.759927 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 11:49:36.759898749 +0000 UTC m=+92.398165512 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.760018 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.760047 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.760132 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.760229 4837 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.760272 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:36.760264371 +0000 UTC m=+92.398531134 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.760481 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.760529 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.760548 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.760613 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-13 11:49:36.760591911 +0000 UTC m=+92.398858854 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.760715 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.760720 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.760769 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:36.760759756 +0000 UTC m=+92.399026729 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.760891 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.760915 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.760927 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.760971 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:36.760956232 +0000 UTC m=+92.399223185 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.842865 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.842924 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.842938 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.842962 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.842975 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:35Z","lastTransitionTime":"2026-03-13T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.862008 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.862177 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.862262 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs podName:86e5afeb-4720-4593-a53e-dfb5381d0b1d nodeName:}" failed. No retries permitted until 2026-03-13 11:49:36.862239572 +0000 UTC m=+92.500506515 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs") pod "network-metrics-daemon-cjn4q" (UID: "86e5afeb-4720-4593-a53e-dfb5381d0b1d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.945898 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.945947 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.945956 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.945973 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.945983 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:35Z","lastTransitionTime":"2026-03-13T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.036711 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.036763 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.036774 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.036792 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.036804 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.047069 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.051859 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.051921 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.051936 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.051955 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.051965 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.061029 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.064970 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.065002 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.065012 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.065028 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.065038 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.074476 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.078677 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.078881 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.078975 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.079095 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.079193 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.090817 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.096218 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.096250 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.096259 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.096276 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.096289 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.107830 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.107949 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.109721 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.109761 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.109771 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.109787 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.109796 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.212538 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.212590 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.212601 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.212623 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.212651 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.315658 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.315745 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.315757 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.315780 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.315794 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.418861 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.418920 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.418933 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.418950 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.418959 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.432934 4837 generic.go:334] "Generic (PLEG): container finished" podID="43df29f7-1351-41f5-bfca-17f804837cb4" containerID="4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60" exitCode=0 Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.432986 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.433036 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerStarted","Data":"17148b76b47a8d352ae2adca8c21dbaa4b189a84d57c2f7678c2d83f59bfc901"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.435509 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.435540 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.435553 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"f78b0dfed51389d19f2f72872d4eb4ed23f39b0b8057b3cf1d510ef956001134"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 
11:49:36.437863 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xwmn9" event={"ID":"f6398583-f9ff-4b10-829a-503fd523710b","Type":"ContainerStarted","Data":"81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.437929 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xwmn9" event={"ID":"f6398583-f9ff-4b10-829a-503fd523710b","Type":"ContainerStarted","Data":"166ac2e7f6f6b6b3592e07d9264bf9325d076196c25c41eff24468e141f1843d"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.440238 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.440267 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"81d43c1d485ad8415596ee869abae4167674dbed992582bf1e3cc0ea9b78d6b5"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.442148 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-np68d" event={"ID":"4c126c88-4541-474c-bc1f-5ca9befa3146","Type":"ContainerStarted","Data":"e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.442312 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-np68d" event={"ID":"4c126c88-4541-474c-bc1f-5ca9befa3146","Type":"ContainerStarted","Data":"2881af3742e0d50b334f23b774fbebbacf8b6806c85f4d40f633913a88a6d442"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.445235 4837 generic.go:334] "Generic (PLEG): container finished" 
podID="501b48f2-bba8-44d4-81df-7a8b7df456b5" containerID="595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1" exitCode=0 Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.445341 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" event={"ID":"501b48f2-bba8-44d4-81df-7a8b7df456b5","Type":"ContainerDied","Data":"595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.445417 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" event={"ID":"501b48f2-bba8-44d4-81df-7a8b7df456b5","Type":"ContainerStarted","Data":"a1b4ca9c1b4c55aa909d80a4fa2f48c689ec4c3090dd6f678eb520f265556c71"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.447351 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" event={"ID":"e05c56f7-b007-4165-9e29-98cfa865d020","Type":"ContainerStarted","Data":"010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.447399 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" event={"ID":"e05c56f7-b007-4165-9e29-98cfa865d020","Type":"ContainerStarted","Data":"35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.447415 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" event={"ID":"e05c56f7-b007-4165-9e29-98cfa865d020","Type":"ContainerStarted","Data":"d6862f2ba91bcb2caa3e47a7b0c9f6fb516532e510a2cd0268bf640898a72c73"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.448616 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qg957" 
event={"ID":"cbb3f4c6-a6c5-4059-8beb-04179d70aff5","Type":"ContainerStarted","Data":"9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.448701 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qg957" event={"ID":"cbb3f4c6-a6c5-4059-8beb-04179d70aff5","Type":"ContainerStarted","Data":"47c18be3596bf1461dbdaefb54c85ce132865b95bab24e300e61f29af8e5460c"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.450088 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ccefe8faa7e6cf0cf99365286fda2cbf0f3e1517fbef569ff1b331d009363fca"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.452612 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.452752 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.453281 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"cb20f8c50a429be8b1d5add1f63d65a117337629d00c45bf4b2b6cd2d0def957"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.461671 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.482748 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.499649 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.514225 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.521596 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.521660 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.521673 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.521694 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.521707 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.527198 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.538184 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.552476 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.572045 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.589237 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.606879 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.624052 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.624099 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.624110 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.624127 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.624137 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.627841 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.643538 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.659750 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.672452 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc 
kubenswrapper[4837]: I0313 11:49:36.706730 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.727570 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.727627 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.727653 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.727673 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.727687 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.728439 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.761154 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565d
b798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.773202 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.773328 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.773350 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.773437 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:49:38.773403051 +0000 UTC m=+94.411669814 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.773457 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.773516 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:38.773502445 +0000 UTC m=+94.411769208 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.773520 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.773590 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.773603 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.773609 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:38.773590387 +0000 UTC m=+94.411857150 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.773614 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.773535 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.773678 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:38.773636849 +0000 UTC m=+94.411903612 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.773694 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.773759 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.773770 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.773777 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.773802 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-13 11:49:38.773796294 +0000 UTC m=+94.412063057 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.774508 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.786437 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.802845 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.817908 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.835866 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc 
kubenswrapper[4837]: I0313 11:49:36.835956 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.835971 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.835989 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.836000 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.835991 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.846543 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc 
kubenswrapper[4837]: I0313 11:49:36.862308 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.874346 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.874573 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.874725 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs podName:86e5afeb-4720-4593-a53e-dfb5381d0b1d nodeName:}" failed. No retries permitted until 2026-03-13 11:49:38.874692681 +0000 UTC m=+94.512959604 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs") pod "network-metrics-daemon-cjn4q" (UID: "86e5afeb-4720-4593-a53e-dfb5381d0b1d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.879591 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.908202 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.923467 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.935810 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.938206 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.938247 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.938255 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.938273 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.938285 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.040528 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.040585 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.040598 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.040618 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.040631 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:37Z","lastTransitionTime":"2026-03-13T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.047370 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.047336 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.047445 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.047336 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 11:49:37 crc kubenswrapper[4837]: E0313 11:49:37.047531 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d"
Mar 13 11:49:37 crc kubenswrapper[4837]: E0313 11:49:37.047692 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 11:49:37 crc kubenswrapper[4837]: E0313 11:49:37.047767 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 11:49:37 crc kubenswrapper[4837]: E0313 11:49:37.047837 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.057064 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.058135 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.058875 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.059555 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.060690 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.061225 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.062216 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.062831 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.063919 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.064427 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.065357 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.066050 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.070335 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.071518 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.072276 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.073691 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.074589 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.076094 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.077444 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.078203 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.079403 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.080182 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.080697 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.081883 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.082384 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.083479 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.084387 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.084923 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.085501 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.086018 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.086574 4837 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.086702 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.088084 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.088613 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.089148 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.090277 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.090948 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.091538 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.095106 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.096393 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.097102 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.098011 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.099294 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.100538 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.101459 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.102336 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.104628 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.105862 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.107019 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.107714 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.108303 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.109628 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.110444 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.111593 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.144063 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.144101 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.144109 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.144127 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.144136 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:37Z","lastTransitionTime":"2026-03-13T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.210770 4837 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.247234 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.247270 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.247281 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.247300 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.247314 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:37Z","lastTransitionTime":"2026-03-13T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.349985 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.350226 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.350235 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.350250 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.350259 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:37Z","lastTransitionTime":"2026-03-13T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.452767 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.452799 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.452807 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.452821 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.452830 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:37Z","lastTransitionTime":"2026-03-13T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.457617 4837 generic.go:334] "Generic (PLEG): container finished" podID="501b48f2-bba8-44d4-81df-7a8b7df456b5" containerID="dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39" exitCode=0
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.457727 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" event={"ID":"501b48f2-bba8-44d4-81df-7a8b7df456b5","Type":"ContainerDied","Data":"dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39"}
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.471467 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerStarted","Data":"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793"}
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.471544 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerStarted","Data":"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2"}
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.471567 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerStarted","Data":"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d"}
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.471580 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerStarted","Data":"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1"}
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.481043 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.494138 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.508312 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z"
Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.521773 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z"
Mar 13 11:49:37 crc
kubenswrapper[4837]: I0313 11:49:37.535836 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.551052 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.554692 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.554725 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.554738 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.554754 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.554763 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:37Z","lastTransitionTime":"2026-03-13T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.567373 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.581333 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.593944 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.604574 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c
10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.617789 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.640389 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.656487 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.657751 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.657780 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.657791 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 
11:49:37.657809 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.657822 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:37Z","lastTransitionTime":"2026-03-13T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.669483 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.760695 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.760760 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.760770 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.760788 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.760798 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:37Z","lastTransitionTime":"2026-03-13T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.863848 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.864255 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.864270 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.864290 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.864303 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:37Z","lastTransitionTime":"2026-03-13T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.966530 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.966564 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.966572 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.966593 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.966608 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:37Z","lastTransitionTime":"2026-03-13T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.069685 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.069729 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.069740 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.069756 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.069768 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:38Z","lastTransitionTime":"2026-03-13T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.172460 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.172769 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.172779 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.172794 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.172806 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:38Z","lastTransitionTime":"2026-03-13T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.280140 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.280187 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.280206 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.280224 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.280235 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:38Z","lastTransitionTime":"2026-03-13T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.383243 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.383282 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.383294 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.383311 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.383323 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:38Z","lastTransitionTime":"2026-03-13T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.479687 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerStarted","Data":"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.479761 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerStarted","Data":"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.482711 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.485550 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.485593 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.485608 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.485628 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.485672 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:38Z","lastTransitionTime":"2026-03-13T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.486540 4837 generic.go:334] "Generic (PLEG): container finished" podID="501b48f2-bba8-44d4-81df-7a8b7df456b5" containerID="578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138" exitCode=0 Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.486575 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" event={"ID":"501b48f2-bba8-44d4-81df-7a8b7df456b5","Type":"ContainerDied","Data":"578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.499136 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c0
4bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.519417 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.536958 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.552980 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.567308 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.581415 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.589463 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.589500 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.589509 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.589526 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.589539 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:38Z","lastTransitionTime":"2026-03-13T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.594159 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.610038 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1
fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.623140 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.643672 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.664269 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.678735 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.692485 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.692607 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.692625 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.692674 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.692703 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:38Z","lastTransitionTime":"2026-03-13T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.694410 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.706188 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc 
kubenswrapper[4837]: I0313 11:49:38.724790 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc 
kubenswrapper[4837]: I0313 11:49:38.745896 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.760744 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.774812 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565d
b798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.791659 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c
10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.792282 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 
11:49:38.792462 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:49:42.792436829 +0000 UTC m=+98.430703772 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.792503 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.792548 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.792618 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.792680 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.792786 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:42.792758719 +0000 UTC m=+98.431025672 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.792797 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.792885 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:42.792860642 +0000 UTC m=+98.431127445 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.792893 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.792921 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.792936 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.792984 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:42.792973596 +0000 UTC m=+98.431240559 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.793312 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.793578 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.793627 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.793654 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.793733 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-13 11:49:42.793711029 +0000 UTC m=+98.431977792 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.795673 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.795703 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.795718 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.795738 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.795755 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:38Z","lastTransitionTime":"2026-03-13T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.821323 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 
11:49:38.840762 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.858761 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T
11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.872379 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.884451 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49
:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.894143 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.894315 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.894385 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs podName:86e5afeb-4720-4593-a53e-dfb5381d0b1d nodeName:}" failed. No retries permitted until 2026-03-13 11:49:42.894366019 +0000 UTC m=+98.532632782 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs") pod "network-metrics-daemon-cjn4q" (UID: "86e5afeb-4720-4593-a53e-dfb5381d0b1d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.897452 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-
cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.898574 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.898610 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.898622 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.898666 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.898687 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:38Z","lastTransitionTime":"2026-03-13T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.913739 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.931588 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.943670 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.000954 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 
11:49:39.001000 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.001008 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.001030 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.001042 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:39Z","lastTransitionTime":"2026-03-13T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.047626 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.047681 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.047662 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.047764 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:39 crc kubenswrapper[4837]: E0313 11:49:39.047801 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:39 crc kubenswrapper[4837]: E0313 11:49:39.047860 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:39 crc kubenswrapper[4837]: E0313 11:49:39.047922 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:39 crc kubenswrapper[4837]: E0313 11:49:39.048190 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.103383 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.103419 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.103428 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.103444 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.103457 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:39Z","lastTransitionTime":"2026-03-13T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.206138 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.206180 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.206191 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.206208 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.206219 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:39Z","lastTransitionTime":"2026-03-13T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.309167 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.309225 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.309240 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.309260 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.309273 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:39Z","lastTransitionTime":"2026-03-13T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.411974 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.412021 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.412033 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.412053 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.412065 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:39Z","lastTransitionTime":"2026-03-13T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.494117 4837 generic.go:334] "Generic (PLEG): container finished" podID="501b48f2-bba8-44d4-81df-7a8b7df456b5" containerID="8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923" exitCode=0 Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.494171 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" event={"ID":"501b48f2-bba8-44d4-81df-7a8b7df456b5","Type":"ContainerDied","Data":"8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923"} Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.509814 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.519015 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.519076 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.519095 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.519121 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.519136 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:39Z","lastTransitionTime":"2026-03-13T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.538836 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.551687 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.565705 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.587588 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.603314 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565d
b798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.618793 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c
10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.623543 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.623608 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.623622 4837 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.623668 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.623684 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:39Z","lastTransitionTime":"2026-03-13T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.635965 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.659515 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.681065 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.702261 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.713312 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc 
kubenswrapper[4837]: I0313 11:49:39.723698 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.726547 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.726576 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.726586 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.726605 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.726616 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:39Z","lastTransitionTime":"2026-03-13T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.734036 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.829420 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.829465 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.829473 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.829490 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.829501 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:39Z","lastTransitionTime":"2026-03-13T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.932403 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.932455 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.932473 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.932494 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.932508 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:39Z","lastTransitionTime":"2026-03-13T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.035495 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.035770 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.035992 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.036120 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.036204 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:40Z","lastTransitionTime":"2026-03-13T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.138348 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.138823 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.138838 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.138861 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.138875 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:40Z","lastTransitionTime":"2026-03-13T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.240618 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.240680 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.240694 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.240712 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.240723 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:40Z","lastTransitionTime":"2026-03-13T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.342683 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.342723 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.342734 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.342752 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.342764 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:40Z","lastTransitionTime":"2026-03-13T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.444586 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.444621 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.444629 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.444669 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.444681 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:40Z","lastTransitionTime":"2026-03-13T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.501971 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerStarted","Data":"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a"} Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.504865 4837 generic.go:334] "Generic (PLEG): container finished" podID="501b48f2-bba8-44d4-81df-7a8b7df456b5" containerID="6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f" exitCode=0 Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.504902 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" event={"ID":"501b48f2-bba8-44d4-81df-7a8b7df456b5","Type":"ContainerDied","Data":"6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f"} Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.520934 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.536196 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.548292 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.548324 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.548337 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.548357 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.548371 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:40Z","lastTransitionTime":"2026-03-13T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.550795 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.566166 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.579528 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.593246 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.606796 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565d
b798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.620374 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c
10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.633879 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.651343 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.651891 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.651922 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.651934 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.651951 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.651960 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:40Z","lastTransitionTime":"2026-03-13T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.668684 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.681051 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.693561 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.703713 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 
2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.753792 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.753824 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.753836 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.753853 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.753864 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:40Z","lastTransitionTime":"2026-03-13T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.857265 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.857317 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.857328 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.857345 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.857355 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:40Z","lastTransitionTime":"2026-03-13T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.960391 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.960433 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.960443 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.960460 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.960472 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:40Z","lastTransitionTime":"2026-03-13T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.048239 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.048303 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.048274 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:41 crc kubenswrapper[4837]: E0313 11:49:41.048433 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.048450 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:41 crc kubenswrapper[4837]: E0313 11:49:41.048541 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:41 crc kubenswrapper[4837]: E0313 11:49:41.048617 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:41 crc kubenswrapper[4837]: E0313 11:49:41.048788 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.062824 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.062873 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.062890 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.062911 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.062928 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:41Z","lastTransitionTime":"2026-03-13T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.166380 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.166435 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.166449 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.166470 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.166482 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:41Z","lastTransitionTime":"2026-03-13T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.269613 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.269695 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.269780 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.269813 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.269826 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:41Z","lastTransitionTime":"2026-03-13T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.373280 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.373553 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.373712 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.374288 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.374633 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:41Z","lastTransitionTime":"2026-03-13T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.476980 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.477056 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.477081 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.477112 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.477136 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:41Z","lastTransitionTime":"2026-03-13T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.517927 4837 generic.go:334] "Generic (PLEG): container finished" podID="501b48f2-bba8-44d4-81df-7a8b7df456b5" containerID="abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11" exitCode=0 Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.518002 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" event={"ID":"501b48f2-bba8-44d4-81df-7a8b7df456b5","Type":"ContainerDied","Data":"abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11"} Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.537332 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.554986 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.570658 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.580279 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.580341 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.580355 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.580377 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.580396 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:41Z","lastTransitionTime":"2026-03-13T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.585987 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.599790 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.616667 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.629224 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565d
b798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.640738 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c
10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.655814 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.677418 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.682556 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.682590 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.682597 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.682611 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.682623 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:41Z","lastTransitionTime":"2026-03-13T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.694676 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.707020 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.720046 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.741065 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 
2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.785212 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.785255 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.785264 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.785280 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.785310 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:41Z","lastTransitionTime":"2026-03-13T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.889153 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.889208 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.889229 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.889253 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.889270 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:41Z","lastTransitionTime":"2026-03-13T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.992864 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.992955 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.992979 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.993012 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.993037 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:41Z","lastTransitionTime":"2026-03-13T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.065310 4837 scope.go:117] "RemoveContainer" containerID="6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8" Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.065515 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.066155 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.102502 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.102558 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.102574 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.102597 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.102612 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:42Z","lastTransitionTime":"2026-03-13T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.205588 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.205625 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.205634 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.205662 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.205671 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:42Z","lastTransitionTime":"2026-03-13T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.308836 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.309118 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.309127 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.309143 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.309154 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:42Z","lastTransitionTime":"2026-03-13T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.411985 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.412036 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.412047 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.412064 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.412074 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:42Z","lastTransitionTime":"2026-03-13T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.514890 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.514971 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.514992 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.515022 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.515045 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:42Z","lastTransitionTime":"2026-03-13T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.526077 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" event={"ID":"501b48f2-bba8-44d4-81df-7a8b7df456b5","Type":"ContainerStarted","Data":"ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128"} Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.531910 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerStarted","Data":"1682ba45a5caded567709ca21681b997665e2b7d3be2fade571b7391f8e1ec9b"} Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.532524 4837 scope.go:117] "RemoveContainer" containerID="6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8" Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.532747 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.543596 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565d
b798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.560946 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c
10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.575545 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.592564 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.605606 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.618018 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.618542 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.618566 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.618573 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.618587 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:42 crc kubenswrapper[4837]: 
I0313 11:49:42.618597 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:42Z","lastTransitionTime":"2026-03-13T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.641598 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f2
2f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.658953 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.674425 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.684399 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.699623 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.713698 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.720973 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 
11:49:42.721014 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.721026 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.721047 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.721059 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:42Z","lastTransitionTime":"2026-03-13T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.725913 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.741327 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.760254 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.781781 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc 
kubenswrapper[4837]: I0313 11:49:42.798863 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.808985 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.821012 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.823011 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.823071 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.823089 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.823110 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.823124 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:42Z","lastTransitionTime":"2026-03-13T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.835143 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.842649 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.842750 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.842782 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.842827 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.842852 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.843024 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.843050 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.843064 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.843113 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:50.843097184 +0000 UTC m=+106.481363947 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.843433 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:49:50.843418414 +0000 UTC m=+106.481685177 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.843477 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.843518 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:50.843500997 +0000 UTC m=+106.481767760 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.843574 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.843606 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-13 11:49:50.84359569 +0000 UTC m=+106.481862453 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.843687 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.843711 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.843721 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.843752 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:50.843742655 +0000 UTC m=+106.482009428 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.844739 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.858853 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.869466 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.880268 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565d
b798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.891202 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c
10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.904951 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.920524 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1682ba45a5caded567709ca21681b997665e2b7d3be2fade571b7391f8e1ec9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.925230 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.925272 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.925283 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.925302 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.925316 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:42Z","lastTransitionTime":"2026-03-13T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.932166 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.943418 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.943663 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.943746 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs podName:86e5afeb-4720-4593-a53e-dfb5381d0b1d nodeName:}" failed. No retries permitted until 2026-03-13 11:49:50.943723843 +0000 UTC m=+106.581990616 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs") pod "network-metrics-daemon-cjn4q" (UID: "86e5afeb-4720-4593-a53e-dfb5381d0b1d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.946771 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.958649 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\
\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\
\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.027834 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.027886 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.027904 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.027926 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.027940 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:43Z","lastTransitionTime":"2026-03-13T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.047150 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 11:49:43 crc kubenswrapper[4837]: E0313 11:49:43.047275 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.047673 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 11:49:43 crc kubenswrapper[4837]: E0313 11:49:43.047754 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.047817 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 11:49:43 crc kubenswrapper[4837]: E0313 11:49:43.047925 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.047966 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q"
Mar 13 11:49:43 crc kubenswrapper[4837]: E0313 11:49:43.048056 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.130540 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.130606 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.130623 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.130683 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.130711 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:43Z","lastTransitionTime":"2026-03-13T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.234084 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.234170 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.234189 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.234219 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.234289 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:43Z","lastTransitionTime":"2026-03-13T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.336917 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.336981 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.336991 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.337006 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.337016 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:43Z","lastTransitionTime":"2026-03-13T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.440061 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.440419 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.440612 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.440860 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.441037 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:43Z","lastTransitionTime":"2026-03-13T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.536239 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.536383 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.536459 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.544906 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.544959 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.544973 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.544997 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.545018 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:43Z","lastTransitionTime":"2026-03-13T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.569280 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.569375 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.583768 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.596413 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565d
b798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.610378 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c
10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.624495 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z"
Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.641666 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1682ba45a5caded567709ca21681b997665e2b7d3be2fade571b7391f8e1ec9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.647339 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.647377 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.647411 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.647429 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.647441 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:43Z","lastTransitionTime":"2026-03-13T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.655949 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.671021 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\
\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\
\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.681793 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.692701 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc 
kubenswrapper[4837]: I0313 11:49:43.709189 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.721820 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.737473 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.748310 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.749973 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 
11:49:43.750006 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.750015 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.750031 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.750057 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:43Z","lastTransitionTime":"2026-03-13T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.757828 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.770556 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.782513 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.794789 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.806415 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.817091 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.826304 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.842501 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.852777 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.852813 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.852822 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.852836 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.852846 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:43Z","lastTransitionTime":"2026-03-13T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.854440 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.865479 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1
fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.880440 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.905733 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1682ba45a5caded567709ca21681b997665e2b7d3be2fade571b7391f8e1ec9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.921165 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.940315 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\
\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\
\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.954980 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.955575 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.955657 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.955674 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.955699 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.955718 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:43Z","lastTransitionTime":"2026-03-13T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.967265 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.977096 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:44 crc 
kubenswrapper[4837]: I0313 11:49:44.058933 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.058991 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.059008 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.059037 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.059056 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:44Z","lastTransitionTime":"2026-03-13T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.162114 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.162164 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.162176 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.162198 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.162211 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:44Z","lastTransitionTime":"2026-03-13T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.265739 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.265790 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.265802 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.265821 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.265832 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:44Z","lastTransitionTime":"2026-03-13T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.368199 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.368243 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.368254 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.368273 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.368284 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:44Z","lastTransitionTime":"2026-03-13T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.470910 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.470976 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.470994 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.471019 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.471037 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:44Z","lastTransitionTime":"2026-03-13T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.575823 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.575866 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.575877 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.575895 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.575908 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:44Z","lastTransitionTime":"2026-03-13T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.678570 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.678668 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.678684 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.678701 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.678711 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:44Z","lastTransitionTime":"2026-03-13T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.781445 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.781519 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.781541 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.781798 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.781817 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:44Z","lastTransitionTime":"2026-03-13T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.884555 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.884591 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.884599 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.884614 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.884623 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:44Z","lastTransitionTime":"2026-03-13T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.987794 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.987839 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.987849 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.987870 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.987883 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:44Z","lastTransitionTime":"2026-03-13T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.047630 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q"
Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.047685 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.047769 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.047780 4837 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:45 crc kubenswrapper[4837]: E0313 11:49:45.047930 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:49:45 crc kubenswrapper[4837]: E0313 11:49:45.048041 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:45 crc kubenswrapper[4837]: E0313 11:49:45.048136 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:45 crc kubenswrapper[4837]: E0313 11:49:45.048476 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.060616 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.071556 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-reso
lver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.083474 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc 
kubenswrapper[4837]: I0313 11:49:45.090536 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.090622 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.090632 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.090674 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.090688 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:45Z","lastTransitionTime":"2026-03-13T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.096162 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.109130 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.122964 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.134078 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.145453 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.162505 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.173479 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565d
b798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.187032 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c
10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.193932 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.193975 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.193987 4837 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.194005 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.194020 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:45Z","lastTransitionTime":"2026-03-13T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.200953 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.219047 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1682ba45a5caded567709ca21681b997665e2b7d3be2fade571b7391f8e1ec9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.236266 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d30355198834
3bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333
c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{
\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T
11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.254362 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4
f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.297074 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.297350 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.297437 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.297544 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.297709 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:45Z","lastTransitionTime":"2026-03-13T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.401724 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.402008 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.402131 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.402214 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.402286 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:45Z","lastTransitionTime":"2026-03-13T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.505544 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.505817 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.505875 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.505934 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.505988 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:45Z","lastTransitionTime":"2026-03-13T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.544138 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/0.log" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.546421 4837 generic.go:334] "Generic (PLEG): container finished" podID="43df29f7-1351-41f5-bfca-17f804837cb4" containerID="1682ba45a5caded567709ca21681b997665e2b7d3be2fade571b7391f8e1ec9b" exitCode=1 Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.546571 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"1682ba45a5caded567709ca21681b997665e2b7d3be2fade571b7391f8e1ec9b"} Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.547263 4837 scope.go:117] "RemoveContainer" containerID="1682ba45a5caded567709ca21681b997665e2b7d3be2fade571b7391f8e1ec9b" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.563706 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.579668 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.593528 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.608690 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.609767 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 
11:49:45.609804 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.609815 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.609832 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.609842 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:45Z","lastTransitionTime":"2026-03-13T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.621474 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.636455 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.650538 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565d
b798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.665462 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c
10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.681405 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.707840 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1682ba45a5caded567709ca21681b997665e2b7d3be2fade571b7391f8e1ec9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1682ba45a5caded567709ca21681b997665e2b7d3be2fade571b7391f8e1ec9b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:45Z\\\",\\\"message\\\":\\\"formers/factory.go:160\\\\nI0313 11:49:45.067303 6780 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:45.067813 6780 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:49:45.067853 6780 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 11:49:45.067873 6780 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:49:45.067899 6780 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:49:45.067916 6780 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:49:45.067922 6780 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:49:45.067948 6780 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 11:49:45.067960 6780 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:49:45.067970 6780 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 11:49:45.067980 6780 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:49:45.067989 6780 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:49:45.068001 6780 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:49:45.068051 6780 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.712781 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.712829 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.712846 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.712871 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.712890 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:45Z","lastTransitionTime":"2026-03-13T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.723507 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.741319 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\
\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\
\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.754400 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.764303 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.774877 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 
2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.816346 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.816397 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.816406 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.816424 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.816435 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:45Z","lastTransitionTime":"2026-03-13T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.919757 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.919815 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.919835 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.919877 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.919891 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:45Z","lastTransitionTime":"2026-03-13T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.022493 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.022540 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.022554 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.022571 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.022580 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.125626 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.125691 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.125707 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.125726 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.125738 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.228359 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.228401 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.228411 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.228426 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.228437 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.331617 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.331683 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.331692 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.331709 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.331724 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.387681 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.387735 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.387752 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.387772 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.387783 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: E0313 11:49:46.401514 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.406057 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.406092 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.406101 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.406123 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.406133 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: E0313 11:49:46.418522 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.422779 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.422917 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.423000 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.423081 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.423140 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: E0313 11:49:46.436291 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.446409 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.446501 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.446515 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.446540 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.446884 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: E0313 11:49:46.461756 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.466559 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.466607 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.466618 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.466648 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.466689 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: E0313 11:49:46.480387 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: E0313 11:49:46.480619 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.482468 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.482513 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.482529 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.482553 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.482568 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.551341 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/1.log" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.552120 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/0.log" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.555204 4837 generic.go:334] "Generic (PLEG): container finished" podID="43df29f7-1351-41f5-bfca-17f804837cb4" containerID="e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4" exitCode=1 Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.555258 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4"} Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.555311 4837 scope.go:117] "RemoveContainer" containerID="1682ba45a5caded567709ca21681b997665e2b7d3be2fade571b7391f8e1ec9b" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.556240 4837 scope.go:117] "RemoveContainer" containerID="e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4" Mar 13 11:49:46 crc kubenswrapper[4837]: E0313 11:49:46.556477 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.570611 4837 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.586004 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.586060 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.586072 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.586093 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.586108 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.586366 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.600820 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc 
kubenswrapper[4837]: I0313 11:49:46.617622 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.637192 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.657305 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.679711 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.689202 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 
11:49:46.689256 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.689270 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.689290 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.689304 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.695147 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.714795 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.731718 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565d
b798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.749317 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c
10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.767716 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.792089 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc 
kubenswrapper[4837]: I0313 11:49:46.792140 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.792152 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.792176 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.792190 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.801442 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1682ba45a5caded567709ca21681b997665e2b7d3be2fade571b7391f8e1ec9b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:45Z\\\",\\\"message\\\":\\\"formers/factory.go:160\\\\nI0313 11:49:45.067303 6780 reflector.go:311] Stopping 
reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:45.067813 6780 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:49:45.067853 6780 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 11:49:45.067873 6780 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:49:45.067899 6780 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:49:45.067916 6780 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:49:45.067922 6780 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:49:45.067948 6780 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 11:49:45.067960 6780 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:49:45.067970 6780 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 11:49:45.067980 6780 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:49:45.067989 6780 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:49:45.068001 6780 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:49:45.068051 6780 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\":160\\\\nI0313 11:49:46.362104 6913 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:46.362220 6913 
reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 11:49:46.362340 6913 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:49:46.362469 6913 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:49:46.362889 6913 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:46.374778 6913 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0313 11:49:46.374817 6913 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0313 11:49:46.374885 6913 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:49:46.374926 6913 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 11:49:46.375025 6913 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.823044 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3
305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.841268 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.895088 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.895137 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.895148 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.895164 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.895175 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.998125 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.998215 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.998270 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.998301 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.998320 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.047561 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.047599 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.047684 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.047693 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:47 crc kubenswrapper[4837]: E0313 11:49:47.047835 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:47 crc kubenswrapper[4837]: E0313 11:49:47.047948 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:47 crc kubenswrapper[4837]: E0313 11:49:47.048085 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:47 crc kubenswrapper[4837]: E0313 11:49:47.048246 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.101458 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.101495 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.101504 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.101517 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.101527 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:47Z","lastTransitionTime":"2026-03-13T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.205516 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.205578 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.205590 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.205609 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.205624 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:47Z","lastTransitionTime":"2026-03-13T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.308499 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.308584 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.308680 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.308719 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.308743 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:47Z","lastTransitionTime":"2026-03-13T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.411406 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.411469 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.411481 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.411510 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.411524 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:47Z","lastTransitionTime":"2026-03-13T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.513850 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.513904 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.513917 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.513935 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.513950 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:47Z","lastTransitionTime":"2026-03-13T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.562964 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/1.log" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.569482 4837 scope.go:117] "RemoveContainer" containerID="e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4" Mar 13 11:49:47 crc kubenswrapper[4837]: E0313 11:49:47.569739 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.582406 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.596220 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.611998 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.619137 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.619196 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.619210 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.619230 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.619245 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:47Z","lastTransitionTime":"2026-03-13T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.626390 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.635538 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.646490 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.656608 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565d
b798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.667709 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c
10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.678153 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.696760 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\":160\\\\nI0313 11:49:46.362104 6913 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:46.362220 6913 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) 
from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 11:49:46.362340 6913 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:49:46.362469 6913 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:49:46.362889 6913 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:46.374778 6913 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0313 11:49:46.374817 6913 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0313 11:49:46.374885 6913 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:49:46.374926 6913 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 11:49:46.375025 6913 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd98
4be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.711101 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4
f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.722025 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.722215 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.722277 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.722360 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.722420 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:47Z","lastTransitionTime":"2026-03-13T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.729848 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\
\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.743969 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt
\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.757077 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.769388 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc 
kubenswrapper[4837]: I0313 11:49:47.825895 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.825933 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.825941 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.825955 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.825964 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:47Z","lastTransitionTime":"2026-03-13T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.928570 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.928674 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.928689 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.928713 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.928727 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:47Z","lastTransitionTime":"2026-03-13T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.031745 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.031781 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.031792 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.031810 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.031823 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:48Z","lastTransitionTime":"2026-03-13T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.070872 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.135350 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.135400 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.135412 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.135432 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.135448 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:48Z","lastTransitionTime":"2026-03-13T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.249378 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.249445 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.249461 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.249489 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.249507 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:48Z","lastTransitionTime":"2026-03-13T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.353311 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.353361 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.353369 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.353391 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.353400 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:48Z","lastTransitionTime":"2026-03-13T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.456242 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.456297 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.456312 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.456336 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.456351 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:48Z","lastTransitionTime":"2026-03-13T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.559802 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.559842 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.559852 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.559868 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.559878 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:48Z","lastTransitionTime":"2026-03-13T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.662947 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.663009 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.663027 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.663054 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.663071 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:48Z","lastTransitionTime":"2026-03-13T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.765558 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.765607 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.765622 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.765675 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.765695 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:48Z","lastTransitionTime":"2026-03-13T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.868233 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.868295 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.868316 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.868347 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.868370 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:48Z","lastTransitionTime":"2026-03-13T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.971040 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.971078 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.971087 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.971101 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.971110 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:48Z","lastTransitionTime":"2026-03-13T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.048194 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.048239 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.048270 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.048273 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:49 crc kubenswrapper[4837]: E0313 11:49:49.048373 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:49 crc kubenswrapper[4837]: E0313 11:49:49.048469 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:49:49 crc kubenswrapper[4837]: E0313 11:49:49.048548 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:49 crc kubenswrapper[4837]: E0313 11:49:49.048596 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.073384 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.073431 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.073442 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.073460 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.073473 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:49Z","lastTransitionTime":"2026-03-13T11:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.178068 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.178423 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.178565 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.178756 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.178897 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:49Z","lastTransitionTime":"2026-03-13T11:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.282071 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.282123 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.282135 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.282153 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.282169 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:49Z","lastTransitionTime":"2026-03-13T11:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.389863 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.389931 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.389949 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.389974 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.389991 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:49Z","lastTransitionTime":"2026-03-13T11:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.493075 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.493157 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.493183 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.493215 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.493240 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:49Z","lastTransitionTime":"2026-03-13T11:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.596898 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.596962 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.596979 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.597004 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.597022 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:49Z","lastTransitionTime":"2026-03-13T11:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.699479 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.699524 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.699539 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.699558 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.699574 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:49Z","lastTransitionTime":"2026-03-13T11:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.802410 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.802477 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.802495 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.802522 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.802540 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:49Z","lastTransitionTime":"2026-03-13T11:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.906270 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.906352 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.906381 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.906415 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.906442 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:49Z","lastTransitionTime":"2026-03-13T11:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.009736 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.009799 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.009816 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.009842 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.009860 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:50Z","lastTransitionTime":"2026-03-13T11:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.112704 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.112743 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.112754 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.112775 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.112787 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:50Z","lastTransitionTime":"2026-03-13T11:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.215662 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.215727 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.215739 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.215763 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.215779 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:50Z","lastTransitionTime":"2026-03-13T11:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.320766 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.320824 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.320834 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.320852 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.320863 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:50Z","lastTransitionTime":"2026-03-13T11:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.424195 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.424240 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.424251 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.424274 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.424283 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:50Z","lastTransitionTime":"2026-03-13T11:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.526741 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.526799 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.526811 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.526845 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.526859 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:50Z","lastTransitionTime":"2026-03-13T11:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.630184 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.630227 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.630239 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.630257 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.630270 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:50Z","lastTransitionTime":"2026-03-13T11:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.734086 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.734243 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.734267 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.734293 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.734311 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:50Z","lastTransitionTime":"2026-03-13T11:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.837710 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.837764 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.837777 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.837800 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.837818 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:50Z","lastTransitionTime":"2026-03-13T11:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.929877 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 11:49:50 crc kubenswrapper[4837]: E0313 11:49:50.930023 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:06.929999289 +0000 UTC m=+122.568266062 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.930101 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.930136 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.930175 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.930198 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 11:49:50 crc kubenswrapper[4837]: E0313 11:49:50.930218 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 13 11:49:50 crc kubenswrapper[4837]: E0313 11:49:50.930259 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:50:06.930249657 +0000 UTC m=+122.568516420 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 13 11:49:50 crc kubenswrapper[4837]: E0313 11:49:50.930351 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 13 11:49:50 crc kubenswrapper[4837]: E0313 11:49:50.930460 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:50:06.930441783 +0000 UTC m=+122.568708546 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 13 11:49:50 crc kubenswrapper[4837]: E0313 11:49:50.930470 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 13 11:49:50 crc kubenswrapper[4837]: E0313 11:49:50.930531 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 13 11:49:50 crc kubenswrapper[4837]: E0313 11:49:50.930549 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 13 11:49:50 crc kubenswrapper[4837]: E0313 11:49:50.930371 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 13 11:49:50 crc kubenswrapper[4837]: E0313 11:49:50.930626 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 11:50:06.930602547 +0000 UTC m=+122.568869320 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 13 11:49:50 crc kubenswrapper[4837]: E0313 11:49:50.930630 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 13 11:49:50 crc kubenswrapper[4837]: E0313 11:49:50.930668 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 13 11:49:50 crc kubenswrapper[4837]: E0313 11:49:50.930723 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 11:50:06.93070805 +0000 UTC m=+122.568975003 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.940349 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.940393 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.940407 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.940429 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.940446 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:50Z","lastTransitionTime":"2026-03-13T11:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.030775 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q"
Mar 13 11:49:51 crc kubenswrapper[4837]: E0313 11:49:51.030953 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 13 11:49:51 crc kubenswrapper[4837]: E0313 11:49:51.031026 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs podName:86e5afeb-4720-4593-a53e-dfb5381d0b1d nodeName:}" failed. No retries permitted until 2026-03-13 11:50:07.03100895 +0000 UTC m=+122.669275723 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs") pod "network-metrics-daemon-cjn4q" (UID: "86e5afeb-4720-4593-a53e-dfb5381d0b1d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.042950 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.043027 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.043052 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.043084 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.043109 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:51Z","lastTransitionTime":"2026-03-13T11:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.047183 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.047230 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.047248 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.047287 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 11:49:51 crc kubenswrapper[4837]: E0313 11:49:51.047376 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d"
Mar 13 11:49:51 crc kubenswrapper[4837]: E0313 11:49:51.047483 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 11:49:51 crc kubenswrapper[4837]: E0313 11:49:51.047561 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 11:49:51 crc kubenswrapper[4837]: E0313 11:49:51.047694 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.146375 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.146451 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.146473 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.146502 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.146524 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:51Z","lastTransitionTime":"2026-03-13T11:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.249956 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.250000 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.250012 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.250031 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.250045 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:51Z","lastTransitionTime":"2026-03-13T11:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.353287 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.353339 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.353374 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.353395 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.353411 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:51Z","lastTransitionTime":"2026-03-13T11:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.455719 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.455782 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.455799 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.455824 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.455841 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:51Z","lastTransitionTime":"2026-03-13T11:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.558465 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.558548 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.558573 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.558604 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.558626 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:51Z","lastTransitionTime":"2026-03-13T11:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.662151 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.662251 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.662278 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.662313 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.662337 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:51Z","lastTransitionTime":"2026-03-13T11:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.765267 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.765343 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.765361 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.765402 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.765435 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:51Z","lastTransitionTime":"2026-03-13T11:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.873081 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.873123 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.873135 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.873153 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.873165 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:51Z","lastTransitionTime":"2026-03-13T11:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.974928 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.974978 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.974996 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.975013 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.975026 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:51Z","lastTransitionTime":"2026-03-13T11:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.078369 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.078434 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.078449 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.078478 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.078495 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:52Z","lastTransitionTime":"2026-03-13T11:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.183811 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.183884 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.183896 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.183954 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.183984 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:52Z","lastTransitionTime":"2026-03-13T11:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.287585 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.287661 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.287672 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.287690 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.287700 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:52Z","lastTransitionTime":"2026-03-13T11:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.390860 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.390901 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.390912 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.390930 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.390941 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:52Z","lastTransitionTime":"2026-03-13T11:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.494249 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.494302 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.494320 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.494344 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.494361 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:52Z","lastTransitionTime":"2026-03-13T11:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.597349 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.597418 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.597434 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.597465 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.597490 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:52Z","lastTransitionTime":"2026-03-13T11:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.700913 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.700975 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.700984 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.701001 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.701011 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:52Z","lastTransitionTime":"2026-03-13T11:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.804785 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.804831 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.804840 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.804855 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.804865 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:52Z","lastTransitionTime":"2026-03-13T11:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.906898 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.906944 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.906953 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.906972 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.906984 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:52Z","lastTransitionTime":"2026-03-13T11:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.009990 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.010048 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.010060 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.010081 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.010094 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:53Z","lastTransitionTime":"2026-03-13T11:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.047985 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.048053 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.048053 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:53 crc kubenswrapper[4837]: E0313 11:49:53.048220 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.048297 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:53 crc kubenswrapper[4837]: E0313 11:49:53.048471 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:53 crc kubenswrapper[4837]: E0313 11:49:53.048563 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:53 crc kubenswrapper[4837]: E0313 11:49:53.048702 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.112317 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.112700 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.112880 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.113077 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.113264 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:53Z","lastTransitionTime":"2026-03-13T11:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.216416 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.216459 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.216471 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.216505 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.216520 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:53Z","lastTransitionTime":"2026-03-13T11:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.319617 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.319723 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.319747 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.319778 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.319865 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:53Z","lastTransitionTime":"2026-03-13T11:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.422325 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.422387 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.422403 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.422423 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.422432 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:53Z","lastTransitionTime":"2026-03-13T11:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.525920 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.525993 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.526007 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.526032 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.526048 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:53Z","lastTransitionTime":"2026-03-13T11:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.628173 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.628221 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.628235 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.628260 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.628274 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:53Z","lastTransitionTime":"2026-03-13T11:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.731522 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.731571 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.731584 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.731606 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.731619 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:53Z","lastTransitionTime":"2026-03-13T11:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.834973 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.835041 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.835060 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.835091 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.835113 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:53Z","lastTransitionTime":"2026-03-13T11:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.938532 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.938595 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.938608 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.938632 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.938670 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:53Z","lastTransitionTime":"2026-03-13T11:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.041702 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.041758 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.041769 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.041788 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.041799 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:54Z","lastTransitionTime":"2026-03-13T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.048965 4837 scope.go:117] "RemoveContainer" containerID="6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.144973 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.145024 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.145034 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.145053 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.145066 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:54Z","lastTransitionTime":"2026-03-13T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.248141 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.248181 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.248189 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.248224 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.248235 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:54Z","lastTransitionTime":"2026-03-13T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.350515 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.350570 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.350579 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.350597 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.350609 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:54Z","lastTransitionTime":"2026-03-13T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.453562 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.453604 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.453612 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.453630 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.453642 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:54Z","lastTransitionTime":"2026-03-13T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.556525 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.556586 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.556604 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.556627 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.556682 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:54Z","lastTransitionTime":"2026-03-13T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.597366 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.599140 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea"} Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.599591 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.612896 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ba
a38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.623479 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.634835 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.645117 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.655918 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.659470 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.659509 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.659520 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.659539 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.659551 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:54Z","lastTransitionTime":"2026-03-13T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.674022 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\":160\\\\nI0313 11:49:46.362104 6913 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:46.362220 6913 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) 
from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 11:49:46.362340 6913 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:49:46.362469 6913 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:49:46.362889 6913 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:46.374778 6913 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0313 11:49:46.374817 6913 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0313 11:49:46.374885 6913 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:49:46.374926 6913 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 11:49:46.375025 6913 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd98
4be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.692311 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.705084 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.717170 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565d
b798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.727376 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c
10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.739415 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.753727 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.762540 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.762765 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.762881 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.763150 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.763238 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:54Z","lastTransitionTime":"2026-03-13T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.767546 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.778417 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.787286 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.800319 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 
2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.866573 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.866610 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.866621 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.866640 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.866666 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:54Z","lastTransitionTime":"2026-03-13T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.969307 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.969385 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.969407 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.969438 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.969457 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:54Z","lastTransitionTime":"2026-03-13T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.047418 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.047506 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.047575 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:55 crc kubenswrapper[4837]: E0313 11:49:55.047719 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:49:55 crc kubenswrapper[4837]: E0313 11:49:55.047889 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.048027 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:55 crc kubenswrapper[4837]: E0313 11:49:55.048117 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:55 crc kubenswrapper[4837]: E0313 11:49:55.048346 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.066563 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.071222 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.071261 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.071271 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.071286 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.071298 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:55Z","lastTransitionTime":"2026-03-13T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.081265 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.091958 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 
13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.107658 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.119400 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.132256 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.143560 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.158970 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.178353 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.178426 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.178439 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.178463 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.178481 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:55Z","lastTransitionTime":"2026-03-13T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.178569 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.195134 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.214160 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565d
b798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.228229 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c
10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.244836 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.271471 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\":160\\\\nI0313 11:49:46.362104 6913 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:46.362220 6913 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) 
from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 11:49:46.362340 6913 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:49:46.362469 6913 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:49:46.362889 6913 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:46.374778 6913 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0313 11:49:46.374817 6913 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0313 11:49:46.374885 6913 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:49:46.374926 6913 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 11:49:46.375025 6913 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd98
4be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.280700 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.280728 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.280739 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.280752 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.280762 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:55Z","lastTransitionTime":"2026-03-13T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.288096 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.305777 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPat
h\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\
\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.382870 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.382920 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.382928 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.382943 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.382953 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:55Z","lastTransitionTime":"2026-03-13T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.485348 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.485390 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.485400 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.485435 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.485447 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:55Z","lastTransitionTime":"2026-03-13T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.587611 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.587683 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.587697 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.587740 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.587753 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:55Z","lastTransitionTime":"2026-03-13T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.690019 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.690059 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.690070 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.690105 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.690114 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:55Z","lastTransitionTime":"2026-03-13T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.793221 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.793260 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.793270 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.793289 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.793300 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:55Z","lastTransitionTime":"2026-03-13T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.897132 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.897711 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.897795 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.897824 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.897875 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:55Z","lastTransitionTime":"2026-03-13T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.001356 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.001424 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.001446 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.001477 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.001503 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.104161 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.104248 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.104310 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.104347 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.104362 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.206851 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.206963 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.206977 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.207000 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.207015 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.309228 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.309282 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.309296 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.309322 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.309335 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.412199 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.412276 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.412287 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.412306 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.412320 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.515734 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.516091 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.516170 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.516238 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.516322 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.517482 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.517526 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.517539 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.517581 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.517593 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: E0313 11:49:56.532710 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:56Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.537457 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.537510 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.537524 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.537547 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.537562 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: E0313 11:49:56.551507 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:56Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.557895 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.558259 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.558378 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.558477 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.558561 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: E0313 11:49:56.575327 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:56Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.581186 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.581216 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.581226 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.581242 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.581252 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: E0313 11:49:56.600067 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:56Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.604773 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.604810 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.604821 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.604841 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.604851 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: E0313 11:49:56.624448 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:56Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:56 crc kubenswrapper[4837]: E0313 11:49:56.624786 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.627329 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.627357 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.627368 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.627386 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.627398 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.730287 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.730326 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.730338 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.730357 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.730370 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.833750 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.833846 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.833878 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.833916 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.833936 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.936675 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.936710 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.936722 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.936740 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.936752 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.039457 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.039506 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.039522 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.039543 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.039559 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:57Z","lastTransitionTime":"2026-03-13T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.048036 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.048062 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.048117 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:57 crc kubenswrapper[4837]: E0313 11:49:57.048207 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:49:57 crc kubenswrapper[4837]: E0313 11:49:57.048296 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.048339 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:57 crc kubenswrapper[4837]: E0313 11:49:57.048522 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:57 crc kubenswrapper[4837]: E0313 11:49:57.048742 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.062716 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.142845 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.142891 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.142904 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.142926 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.142939 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:57Z","lastTransitionTime":"2026-03-13T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.245881 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.245915 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.245925 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.245941 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.245953 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:57Z","lastTransitionTime":"2026-03-13T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.348975 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.349018 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.349027 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.349045 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.349057 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:57Z","lastTransitionTime":"2026-03-13T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.453196 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.453248 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.453260 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.453277 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.453288 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:57Z","lastTransitionTime":"2026-03-13T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.556155 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.556211 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.556229 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.556255 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.556273 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:57Z","lastTransitionTime":"2026-03-13T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.660372 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.660461 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.660476 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.660498 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.660513 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:57Z","lastTransitionTime":"2026-03-13T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.764146 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.764190 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.764201 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.764218 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.764228 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:57Z","lastTransitionTime":"2026-03-13T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.866045 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.866090 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.866101 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.866119 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.866129 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:57Z","lastTransitionTime":"2026-03-13T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.968780 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.968848 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.968871 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.968903 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.968927 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:57Z","lastTransitionTime":"2026-03-13T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.072480 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.072554 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.072577 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.072607 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.072632 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:58Z","lastTransitionTime":"2026-03-13T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.175974 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.176063 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.176087 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.176129 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.176152 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:58Z","lastTransitionTime":"2026-03-13T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.279190 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.279264 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.279291 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.279327 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.279351 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:58Z","lastTransitionTime":"2026-03-13T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.382975 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.383052 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.383072 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.383096 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.383114 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:58Z","lastTransitionTime":"2026-03-13T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.486862 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.486931 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.486949 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.486975 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.486994 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:58Z","lastTransitionTime":"2026-03-13T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.590830 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.590876 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.590889 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.590918 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.590933 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:58Z","lastTransitionTime":"2026-03-13T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.694495 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.694561 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.694578 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.694605 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.694623 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:58Z","lastTransitionTime":"2026-03-13T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.797964 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.798348 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.798426 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.798537 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.798609 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:58Z","lastTransitionTime":"2026-03-13T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.901817 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.901868 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.901883 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.901908 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.901924 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:58Z","lastTransitionTime":"2026-03-13T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.005420 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.005501 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.005519 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.005550 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.005574 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:59Z","lastTransitionTime":"2026-03-13T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.047357 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.047473 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.047473 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.047692 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:59 crc kubenswrapper[4837]: E0313 11:49:59.048008 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:59 crc kubenswrapper[4837]: E0313 11:49:59.047846 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:59 crc kubenswrapper[4837]: E0313 11:49:59.048142 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:59 crc kubenswrapper[4837]: E0313 11:49:59.048246 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.109138 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.109841 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.109893 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.109935 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.109954 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:59Z","lastTransitionTime":"2026-03-13T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.213779 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.213837 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.213854 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.213878 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.213899 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:59Z","lastTransitionTime":"2026-03-13T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.317567 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.317914 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.318023 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.318140 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.318242 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:59Z","lastTransitionTime":"2026-03-13T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.424404 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.424453 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.424467 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.424487 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.424504 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:59Z","lastTransitionTime":"2026-03-13T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.526867 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.526909 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.526919 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.526935 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.526945 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:59Z","lastTransitionTime":"2026-03-13T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.629432 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.629694 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.629723 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.629788 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.629808 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:59Z","lastTransitionTime":"2026-03-13T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.733581 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.733688 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.733704 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.733753 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.733769 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:59Z","lastTransitionTime":"2026-03-13T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.836601 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.836690 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.836714 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.836749 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.836772 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:59Z","lastTransitionTime":"2026-03-13T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.939405 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.939466 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.939483 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.939510 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.939533 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:59Z","lastTransitionTime":"2026-03-13T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.042569 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.042706 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.042737 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.042769 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.042791 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:00Z","lastTransitionTime":"2026-03-13T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.049304 4837 scope.go:117] "RemoveContainer" containerID="e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.146044 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.146112 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.146132 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.146159 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.146183 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:00Z","lastTransitionTime":"2026-03-13T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.249156 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.249199 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.249210 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.249228 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.249239 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:00Z","lastTransitionTime":"2026-03-13T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.351261 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.351313 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.351330 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.351350 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.351364 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:00Z","lastTransitionTime":"2026-03-13T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.453919 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.453963 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.453973 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.453987 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.453997 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:00Z","lastTransitionTime":"2026-03-13T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.559083 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.559133 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.559148 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.559167 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.559179 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:00Z","lastTransitionTime":"2026-03-13T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.617146 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/1.log" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.620302 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerStarted","Data":"7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7"} Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.620827 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.636787 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55
b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.651287 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.661924 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.661973 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.661987 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.662008 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.662021 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:00Z","lastTransitionTime":"2026-03-13T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.666094 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc 
kubenswrapper[4837]: I0313 11:50:00.679901 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.690598 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.704126 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.716472 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.730121 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.741497 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c
10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.753584 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.763812 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:00 crc 
kubenswrapper[4837]: I0313 11:50:00.763843 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.763852 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.763867 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.763877 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:00Z","lastTransitionTime":"2026-03-13T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.776456 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\":160\\\\nI0313 11:49:46.362104 6913 reflector.go:311] Stopping reflector 
*v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:46.362220 6913 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 11:49:46.362340 6913 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:49:46.362469 6913 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:49:46.362889 6913 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:46.374778 6913 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0313 11:49:46.374817 6913 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0313 11:49:46.374885 6913 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:49:46.374926 6913 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 11:49:46.375025 6913 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:50:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.797407 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.810761 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb785bc-eb5f-41db-9d64-f1cecd2d25f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.824717 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.838475 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565d
b798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.857417 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.866462 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.866505 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.866517 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.866535 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.866549 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:00Z","lastTransitionTime":"2026-03-13T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.875233 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.968174 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.968214 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.968223 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.968242 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.968252 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:00Z","lastTransitionTime":"2026-03-13T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.048155 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.048217 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.048307 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:01 crc kubenswrapper[4837]: E0313 11:50:01.048486 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.048782 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:01 crc kubenswrapper[4837]: E0313 11:50:01.048876 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:01 crc kubenswrapper[4837]: E0313 11:50:01.049300 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:01 crc kubenswrapper[4837]: E0313 11:50:01.049396 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.071183 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.071232 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.071244 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.071261 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.071271 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:01Z","lastTransitionTime":"2026-03-13T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.174666 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.174714 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.174730 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.174750 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.174765 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:01Z","lastTransitionTime":"2026-03-13T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.278169 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.278203 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.278215 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.278231 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.278242 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:01Z","lastTransitionTime":"2026-03-13T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.380162 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.380214 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.380230 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.380253 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.380268 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:01Z","lastTransitionTime":"2026-03-13T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.483105 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.483192 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.483210 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.483235 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.483252 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:01Z","lastTransitionTime":"2026-03-13T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.586346 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.586389 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.586400 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.586421 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.586432 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:01Z","lastTransitionTime":"2026-03-13T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.627019 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/2.log" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.627947 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/1.log" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.632601 4837 generic.go:334] "Generic (PLEG): container finished" podID="43df29f7-1351-41f5-bfca-17f804837cb4" containerID="7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7" exitCode=1 Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.632659 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7"} Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.632714 4837 scope.go:117] "RemoveContainer" containerID="e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.634715 4837 scope.go:117] "RemoveContainer" containerID="7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7" Mar 13 11:50:01 crc kubenswrapper[4837]: E0313 11:50:01.635349 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.660397 4837 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"ima
geID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.689411 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.689470 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.689490 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.689515 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.689535 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:01Z","lastTransitionTime":"2026-03-13T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.691330 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.705340 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.720304 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.734893 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 
2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.751163 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.765028 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.781783 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.792175 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.792348 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.792406 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.792470 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.792528 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:01Z","lastTransitionTime":"2026-03-13T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.802435 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.816113 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.838664 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11
:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.856558 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb785bc-eb5f-41db-9d64-f1cecd2d25f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.870218 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.881532 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565d
b798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.895065 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.895099 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.895107 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:01 crc 
kubenswrapper[4837]: I0313 11:50:01.895121 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.895131 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:01Z","lastTransitionTime":"2026-03-13T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.895433 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13
T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.908662 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-m
ultus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.930082 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\":160\\\\nI0313 11:49:46.362104 6913 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:46.362220 6913 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) 
from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 11:49:46.362340 6913 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:49:46.362469 6913 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:49:46.362889 6913 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:46.374778 6913 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0313 11:49:46.374817 6913 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0313 11:49:46.374885 6913 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:49:46.374926 6913 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 11:49:46.375025 6913 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:00Z\\\",\\\"message\\\":\\\":00.918999 7126 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:50:00.919044 7126 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 11:50:00.919014 7126 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:50:00.919071 7126 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 11:50:00.919092 7126 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:50:00.919110 7126 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 
11:50:00.919107 7126 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:50:00.919127 7126 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:50:00.919146 7126 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:50:00.919153 7126 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:50:00.919178 7126 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:50:00.919162 7126 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:50:00.919211 7126 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:50:00.919218 7126 factory.go:656] Stopping watch factory\\\\nI0313 11:50:00.919227 7126 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 11:50:00.919233 7126 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:50:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\
\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.998803 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 
11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.998838 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.998846 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.998860 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.998870 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:01Z","lastTransitionTime":"2026-03-13T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.101452 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.102079 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.102195 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.102295 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.102380 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:02Z","lastTransitionTime":"2026-03-13T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.205171 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.205233 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.205298 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.205327 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.205348 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:02Z","lastTransitionTime":"2026-03-13T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.308259 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.308313 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.308324 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.308343 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.308356 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:02Z","lastTransitionTime":"2026-03-13T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.411362 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.411418 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.411435 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.411459 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.411471 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:02Z","lastTransitionTime":"2026-03-13T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.514060 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.514098 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.514108 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.514124 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.514133 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:02Z","lastTransitionTime":"2026-03-13T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.616768 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.616813 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.616827 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.616848 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.616862 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:02Z","lastTransitionTime":"2026-03-13T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.637466 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/2.log" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.641014 4837 scope.go:117] "RemoveContainer" containerID="7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7" Mar 13 11:50:02 crc kubenswrapper[4837]: E0313 11:50:02.641191 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.655695 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.670253 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.684280 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.698595 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.711495 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.719370 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.719416 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.719425 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:02 crc 
kubenswrapper[4837]: I0313 11:50:02.719439 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.719449 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:02Z","lastTransitionTime":"2026-03-13T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.723965 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.737702 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.754859 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.775008 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:00Z\\\",\\\"message\\\":\\\":00.918999 7126 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:50:00.919044 7126 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 11:50:00.919014 7126 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:50:00.919071 7126 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 
11:50:00.919092 7126 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:50:00.919110 7126 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:50:00.919107 7126 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:50:00.919127 7126 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:50:00.919146 7126 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:50:00.919153 7126 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:50:00.919178 7126 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:50:00.919162 7126 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:50:00.919211 7126 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:50:00.919218 7126 factory.go:656] Stopping watch factory\\\\nI0313 11:50:00.919227 7126 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 11:50:00.919233 7126 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:50:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd98
4be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.797898 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.814228 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb785bc-eb5f-41db-9d64-f1cecd2d25f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.825621 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.825680 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 
11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.825688 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.825703 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.825712 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:02Z","lastTransitionTime":"2026-03-13T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.835433 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.853222 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.872553 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3
305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.887741 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc 
kubenswrapper[4837]: I0313 11:50:02.903405 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.917871 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.928790 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.928844 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.928854 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.928870 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.928881 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:02Z","lastTransitionTime":"2026-03-13T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.031359 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.031408 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.031422 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.031441 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.031453 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:03Z","lastTransitionTime":"2026-03-13T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.050185 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:03 crc kubenswrapper[4837]: E0313 11:50:03.050310 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.050730 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:03 crc kubenswrapper[4837]: E0313 11:50:03.050798 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.050846 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:03 crc kubenswrapper[4837]: E0313 11:50:03.050892 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.050938 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:03 crc kubenswrapper[4837]: E0313 11:50:03.050990 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.134614 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.134694 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.134719 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.134746 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.134767 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:03Z","lastTransitionTime":"2026-03-13T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.237130 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.237180 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.237191 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.237209 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.237220 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:03Z","lastTransitionTime":"2026-03-13T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.340178 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.340238 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.340251 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.340268 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.340280 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:03Z","lastTransitionTime":"2026-03-13T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.443381 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.443424 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.443432 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.443449 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.443458 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:03Z","lastTransitionTime":"2026-03-13T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.545471 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.545517 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.545528 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.545545 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.545556 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:03Z","lastTransitionTime":"2026-03-13T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.646937 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.646973 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.646985 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.647000 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.647011 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:03Z","lastTransitionTime":"2026-03-13T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.749732 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.749789 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.749799 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.749818 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.749831 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:03Z","lastTransitionTime":"2026-03-13T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.852271 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.852310 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.852318 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.852332 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.852341 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:03Z","lastTransitionTime":"2026-03-13T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.954402 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.954444 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.954456 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.954473 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.954483 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:03Z","lastTransitionTime":"2026-03-13T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.057515 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.057603 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.057616 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.057633 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.057663 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:04Z","lastTransitionTime":"2026-03-13T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.159671 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.159721 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.159733 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.159759 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.159771 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:04Z","lastTransitionTime":"2026-03-13T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.263671 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.263731 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.263746 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.263771 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.263789 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:04Z","lastTransitionTime":"2026-03-13T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.367613 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.367697 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.367708 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.367727 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.367740 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:04Z","lastTransitionTime":"2026-03-13T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.470719 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.470763 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.470773 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.470789 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.470799 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:04Z","lastTransitionTime":"2026-03-13T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.573924 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.573992 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.574003 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.574021 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.574035 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:04Z","lastTransitionTime":"2026-03-13T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.677630 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.677731 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.677744 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.677768 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.677781 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:04Z","lastTransitionTime":"2026-03-13T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.781234 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.781310 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.781359 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.781383 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.781398 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:04Z","lastTransitionTime":"2026-03-13T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.885277 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.885342 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.885351 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.885369 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.885380 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:04Z","lastTransitionTime":"2026-03-13T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:04 crc kubenswrapper[4837]: E0313 11:50:04.986286 4837 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.047946 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:05 crc kubenswrapper[4837]: E0313 11:50:05.048135 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.048257 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:05 crc kubenswrapper[4837]: E0313 11:50:05.048393 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.048469 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:05 crc kubenswrapper[4837]: E0313 11:50:05.048528 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.048596 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:05 crc kubenswrapper[4837]: E0313 11:50:05.048677 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.064528 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.078982 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.093233 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc 
kubenswrapper[4837]: I0313 11:50:05.110662 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.124958 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.137516 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.152542 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: E0313 11:50:05.158115 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.169749 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\
\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.185675 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb785bc-eb5f-41db-9d64-f1cecd2d25f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.200025 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.215815 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565d
b798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.233749 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c
10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.251224 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.272057 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:00Z\\\",\\\"message\\\":\\\":00.918999 7126 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:50:00.919044 7126 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 11:50:00.919014 7126 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:50:00.919071 7126 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 
11:50:00.919092 7126 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:50:00.919110 7126 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:50:00.919107 7126 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:50:00.919127 7126 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:50:00.919146 7126 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:50:00.919153 7126 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:50:00.919178 7126 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:50:00.919162 7126 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:50:00.919211 7126 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:50:00.919218 7126 factory.go:656] Stopping watch factory\\\\nI0313 11:50:00.919227 7126 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 11:50:00.919233 7126 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:50:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd98
4be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.291829 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.307161 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb4
13bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f773
7e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.320608 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.383779 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.401246 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.421089 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.439531 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.454679 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.469722 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.482059 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.496358 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.511956 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.525730 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565d
b798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.537576 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c
10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.551314 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.578217 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:00Z\\\",\\\"message\\\":\\\":00.918999 7126 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:50:00.919044 7126 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 11:50:00.919014 7126 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:50:00.919071 7126 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 
11:50:00.919092 7126 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:50:00.919110 7126 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:50:00.919107 7126 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:50:00.919127 7126 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:50:00.919146 7126 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:50:00.919153 7126 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:50:00.919178 7126 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:50:00.919162 7126 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:50:00.919211 7126 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:50:00.919218 7126 factory.go:656] Stopping watch factory\\\\nI0313 11:50:00.919227 7126 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 11:50:00.919233 7126 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:50:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd98
4be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.596083 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.607082 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb785bc-eb5f-41db-9d64-f1cecd2d25f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.618690 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.632345 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff70
46b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.648745 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3
305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.688077 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.688122 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.688131 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.688146 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.688155 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:06Z","lastTransitionTime":"2026-03-13T11:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:06 crc kubenswrapper[4837]: E0313 11:50:06.702211 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.706226 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.706261 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.706270 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.706286 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.706295 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:06Z","lastTransitionTime":"2026-03-13T11:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:06 crc kubenswrapper[4837]: E0313 11:50:06.718871 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.723712 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.723746 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.723754 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.723769 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.723778 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:06Z","lastTransitionTime":"2026-03-13T11:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:06 crc kubenswrapper[4837]: E0313 11:50:06.738106 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.744090 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.744123 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.744132 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.744146 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.744156 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:06Z","lastTransitionTime":"2026-03-13T11:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:06 crc kubenswrapper[4837]: E0313 11:50:06.763700 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.768206 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.768245 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.768253 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.768270 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.768280 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:06Z","lastTransitionTime":"2026-03-13T11:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:06 crc kubenswrapper[4837]: E0313 11:50:06.780021 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: E0313 11:50:06.780135 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:50:07 crc kubenswrapper[4837]: I0313 11:50:07.008200 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:07 crc kubenswrapper[4837]: I0313 11:50:07.008333 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:07 crc kubenswrapper[4837]: I0313 11:50:07.008367 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:07 crc kubenswrapper[4837]: I0313 11:50:07.008409 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:07 crc kubenswrapper[4837]: I0313 11:50:07.008425 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.008467 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:39.008432331 +0000 UTC m=+154.646699134 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.008538 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.008551 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.008598 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:50:39.008583704 +0000 UTC m=+154.646850467 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.008613 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-13 11:50:39.008607475 +0000 UTC m=+154.646874238 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.008615 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.008666 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.008684 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.008619 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.008759 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.008773 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.008723 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 11:50:39.008708648 +0000 UTC m=+154.646975631 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.008851 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 11:50:39.008833302 +0000 UTC m=+154.647100065 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:50:07 crc kubenswrapper[4837]: I0313 11:50:07.047835 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:07 crc kubenswrapper[4837]: I0313 11:50:07.047835 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.047978 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:07 crc kubenswrapper[4837]: I0313 11:50:07.048000 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:07 crc kubenswrapper[4837]: I0313 11:50:07.047983 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.048073 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.048224 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.048331 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:07 crc kubenswrapper[4837]: I0313 11:50:07.109141 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.109341 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.109417 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs podName:86e5afeb-4720-4593-a53e-dfb5381d0b1d nodeName:}" failed. No retries permitted until 2026-03-13 11:50:39.109399029 +0000 UTC m=+154.747665782 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs") pod "network-metrics-daemon-cjn4q" (UID: "86e5afeb-4720-4593-a53e-dfb5381d0b1d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:50:09 crc kubenswrapper[4837]: I0313 11:50:09.047963 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:09 crc kubenswrapper[4837]: E0313 11:50:09.048374 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:09 crc kubenswrapper[4837]: I0313 11:50:09.048099 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:09 crc kubenswrapper[4837]: I0313 11:50:09.048123 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:09 crc kubenswrapper[4837]: E0313 11:50:09.048552 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:09 crc kubenswrapper[4837]: E0313 11:50:09.048601 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:09 crc kubenswrapper[4837]: I0313 11:50:09.048125 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:09 crc kubenswrapper[4837]: E0313 11:50:09.048709 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:10 crc kubenswrapper[4837]: E0313 11:50:10.159280 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:50:11 crc kubenswrapper[4837]: I0313 11:50:11.048066 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:11 crc kubenswrapper[4837]: E0313 11:50:11.048226 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:11 crc kubenswrapper[4837]: I0313 11:50:11.048219 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:11 crc kubenswrapper[4837]: I0313 11:50:11.048282 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:11 crc kubenswrapper[4837]: I0313 11:50:11.048372 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:11 crc kubenswrapper[4837]: E0313 11:50:11.048313 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:11 crc kubenswrapper[4837]: E0313 11:50:11.048467 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:11 crc kubenswrapper[4837]: E0313 11:50:11.048541 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:13 crc kubenswrapper[4837]: I0313 11:50:13.047515 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:13 crc kubenswrapper[4837]: I0313 11:50:13.047573 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:13 crc kubenswrapper[4837]: I0313 11:50:13.047676 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:13 crc kubenswrapper[4837]: I0313 11:50:13.047783 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:13 crc kubenswrapper[4837]: E0313 11:50:13.047783 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:13 crc kubenswrapper[4837]: E0313 11:50:13.047987 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:13 crc kubenswrapper[4837]: E0313 11:50:13.048061 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:13 crc kubenswrapper[4837]: E0313 11:50:13.048323 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:14 crc kubenswrapper[4837]: I0313 11:50:14.048458 4837 scope.go:117] "RemoveContainer" containerID="7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7" Mar 13 11:50:14 crc kubenswrapper[4837]: E0313 11:50:14.048767 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.047512 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.047580 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:15 crc kubenswrapper[4837]: E0313 11:50:15.047724 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.047546 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.047766 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:15 crc kubenswrapper[4837]: E0313 11:50:15.047895 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:15 crc kubenswrapper[4837]: E0313 11:50:15.048064 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:15 crc kubenswrapper[4837]: E0313 11:50:15.048132 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.067880 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb785bc-eb5f-41db-9d64-f1cecd2d25f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06b
c35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":
\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.088697 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.104256 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565d
b798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.117132 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c
10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.131042 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.149600 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:00Z\\\",\\\"message\\\":\\\":00.918999 7126 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:50:00.919044 7126 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 11:50:00.919014 7126 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:50:00.919071 7126 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 
11:50:00.919092 7126 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:50:00.919110 7126 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:50:00.919107 7126 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:50:00.919127 7126 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:50:00.919146 7126 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:50:00.919153 7126 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:50:00.919178 7126 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:50:00.919162 7126 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:50:00.919211 7126 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:50:00.919218 7126 factory.go:656] Stopping watch factory\\\\nI0313 11:50:00.919227 7126 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 11:50:00.919233 7126 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:50:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd98
4be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: E0313 11:50:15.159724 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.169847 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.184603 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3
305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.199708 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.211934 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.224353 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.235972 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 
2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.249581 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.263256 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.274983 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.288666 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.299555 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.838608 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.838712 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.838731 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.838758 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.838775 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:16Z","lastTransitionTime":"2026-03-13T11:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:16 crc kubenswrapper[4837]: E0313 11:50:16.860391 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:16Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.865777 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.865840 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.865863 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.865891 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.865913 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:16Z","lastTransitionTime":"2026-03-13T11:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:16 crc kubenswrapper[4837]: E0313 11:50:16.886839 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:16Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.891356 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.891421 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.891431 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.891448 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.891459 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:16Z","lastTransitionTime":"2026-03-13T11:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:16 crc kubenswrapper[4837]: E0313 11:50:16.904376 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:16Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.908414 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.908468 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.908479 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.908495 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.908505 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:16Z","lastTransitionTime":"2026-03-13T11:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:16 crc kubenswrapper[4837]: E0313 11:50:16.924312 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:16Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.928682 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.928715 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.928727 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.928742 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.928752 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:16Z","lastTransitionTime":"2026-03-13T11:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:16 crc kubenswrapper[4837]: E0313 11:50:16.940634 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:16Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:16 crc kubenswrapper[4837]: E0313 11:50:16.940762 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:50:17 crc kubenswrapper[4837]: I0313 11:50:17.048061 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:17 crc kubenswrapper[4837]: I0313 11:50:17.048096 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:17 crc kubenswrapper[4837]: I0313 11:50:17.048145 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:17 crc kubenswrapper[4837]: I0313 11:50:17.048069 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:17 crc kubenswrapper[4837]: E0313 11:50:17.048190 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:17 crc kubenswrapper[4837]: E0313 11:50:17.048313 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:17 crc kubenswrapper[4837]: E0313 11:50:17.048442 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:17 crc kubenswrapper[4837]: E0313 11:50:17.048607 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:19 crc kubenswrapper[4837]: I0313 11:50:19.047936 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:19 crc kubenswrapper[4837]: I0313 11:50:19.047940 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:19 crc kubenswrapper[4837]: I0313 11:50:19.047997 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:19 crc kubenswrapper[4837]: I0313 11:50:19.048026 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:19 crc kubenswrapper[4837]: E0313 11:50:19.048874 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:19 crc kubenswrapper[4837]: E0313 11:50:19.048977 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:19 crc kubenswrapper[4837]: E0313 11:50:19.049073 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:19 crc kubenswrapper[4837]: E0313 11:50:19.049144 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:20 crc kubenswrapper[4837]: E0313 11:50:20.161430 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:50:21 crc kubenswrapper[4837]: I0313 11:50:21.047964 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:21 crc kubenswrapper[4837]: I0313 11:50:21.048024 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:21 crc kubenswrapper[4837]: I0313 11:50:21.048084 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:21 crc kubenswrapper[4837]: I0313 11:50:21.047962 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:21 crc kubenswrapper[4837]: E0313 11:50:21.048142 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:21 crc kubenswrapper[4837]: E0313 11:50:21.048246 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:21 crc kubenswrapper[4837]: E0313 11:50:21.048384 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:21 crc kubenswrapper[4837]: E0313 11:50:21.048752 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.048190 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.048218 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.048270 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:23 crc kubenswrapper[4837]: E0313 11:50:23.048349 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.048452 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:23 crc kubenswrapper[4837]: E0313 11:50:23.048485 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:23 crc kubenswrapper[4837]: E0313 11:50:23.048691 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:23 crc kubenswrapper[4837]: E0313 11:50:23.048800 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.715361 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qg957_cbb3f4c6-a6c5-4059-8beb-04179d70aff5/kube-multus/0.log" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.715415 4837 generic.go:334] "Generic (PLEG): container finished" podID="cbb3f4c6-a6c5-4059-8beb-04179d70aff5" containerID="9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27" exitCode=1 Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.715451 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qg957" event={"ID":"cbb3f4c6-a6c5-4059-8beb-04179d70aff5","Type":"ContainerDied","Data":"9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27"} Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.715930 4837 scope.go:117] "RemoveContainer" 
containerID="9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.734115 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d
6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.747734 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.760281 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.774288 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.791096 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.805880 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565d
b798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.819751 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c
10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.835283 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:22Z\\\",\\\"message\\\":\\\"2026-03-13T11:49:37+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124\\\\n2026-03-13T11:49:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124 to /host/opt/cni/bin/\\\\n2026-03-13T11:49:37Z [verbose] multus-daemon started\\\\n2026-03-13T11:49:37Z [verbose] Readiness Indicator file check\\\\n2026-03-13T11:50:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.854548 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:00Z\\\",\\\"message\\\":\\\":00.918999 7126 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:50:00.919044 
7126 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 11:50:00.919014 7126 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:50:00.919071 7126 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 11:50:00.919092 7126 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:50:00.919110 7126 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:50:00.919107 7126 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:50:00.919127 7126 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:50:00.919146 7126 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:50:00.919153 7126 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:50:00.919178 7126 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:50:00.919162 7126 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:50:00.919211 7126 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:50:00.919218 7126 factory.go:656] Stopping watch factory\\\\nI0313 11:50:00.919227 7126 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 11:50:00.919233 7126 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:50:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd98
4be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.874933 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.887450 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb785bc-eb5f-41db-9d64-f1cecd2d25f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.902856 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.915976 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff70
46b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.931714 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3
305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.943695 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc 
kubenswrapper[4837]: I0313 11:50:23.956818 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.968617 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.721151 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qg957_cbb3f4c6-a6c5-4059-8beb-04179d70aff5/kube-multus/0.log" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.721230 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qg957" event={"ID":"cbb3f4c6-a6c5-4059-8beb-04179d70aff5","Type":"ContainerStarted","Data":"19b8a72f10c691a74098997e9d2383adf1aeb1811ad22dc8a74b5a47945d1e3e"} Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.742284 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.755556 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.771677 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.784915 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.799672 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.816488 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb785bc-eb5f-41db-9d64-f1cecd2d25f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.835527 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.852203 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565d
b798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.866717 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c
10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.885981 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b8a72f10c691a74098997e9d2383adf1aeb1811ad22dc8a74b5a47945d1e3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:22Z\\\",\\\"message\\\":\\\"2026-03-13T11:49:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124\\\\n2026-03-13T11:49:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124 to /host/opt/cni/bin/\\\\n2026-03-13T11:49:37Z [verbose] multus-daemon started\\\\n2026-03-13T11:49:37Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T11:50:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:50:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.911498 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:00Z\\\",\\\"message\\\":\\\":00.918999 7126 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:50:00.919044 
7126 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 11:50:00.919014 7126 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:50:00.919071 7126 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 11:50:00.919092 7126 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:50:00.919110 7126 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:50:00.919107 7126 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:50:00.919127 7126 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:50:00.919146 7126 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:50:00.919153 7126 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:50:00.919178 7126 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:50:00.919162 7126 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:50:00.919211 7126 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:50:00.919218 7126 factory.go:656] Stopping watch factory\\\\nI0313 11:50:00.919227 7126 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 11:50:00.919233 7126 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:50:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd98
4be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.937101 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.955817 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb4
13bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f773
7e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.981853 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff70
46b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.001291 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.014899 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.028897 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 
2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.048214 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.048304 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.048348 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.048489 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:25 crc kubenswrapper[4837]: E0313 11:50:25.048524 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:25 crc kubenswrapper[4837]: E0313 11:50:25.048613 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:25 crc kubenswrapper[4837]: E0313 11:50:25.048863 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:25 crc kubenswrapper[4837]: E0313 11:50:25.049027 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.065482 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff70
46b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.087846 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3
305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.104773 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.118625 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.131246 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc 
kubenswrapper[4837]: I0313 11:50:25.146480 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.161822 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: E0313 11:50:25.161960 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.178286 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.193279 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.206565 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.226945 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:00Z\\\",\\\"message\\\":\\\":00.918999 7126 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:50:00.919044 7126 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 11:50:00.919014 7126 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:50:00.919071 7126 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 
11:50:00.919092 7126 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:50:00.919110 7126 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:50:00.919107 7126 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:50:00.919127 7126 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:50:00.919146 7126 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:50:00.919153 7126 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:50:00.919178 7126 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:50:00.919162 7126 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:50:00.919211 7126 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:50:00.919218 7126 factory.go:656] Stopping watch factory\\\\nI0313 11:50:00.919227 7126 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 11:50:00.919233 7126 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:50:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd98
4be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.247180 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.259376 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb785bc-eb5f-41db-9d64-f1cecd2d25f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.272930 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.283875 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565d
b798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.296114 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c
10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.309838 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b8a72f10c691a74098997e9d2383adf1aeb1811ad22dc8a74b5a47945d1e3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:22Z\\\",\\\"message\\\":\\\"2026-03-13T11:49:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124\\\\n2026-03-13T11:49:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124 to /host/opt/cni/bin/\\\\n2026-03-13T11:49:37Z [verbose] multus-daemon started\\\\n2026-03-13T11:49:37Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T11:50:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:50:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.047863 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.047922 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.047936 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:27 crc kubenswrapper[4837]: E0313 11:50:27.048037 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.048172 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:27 crc kubenswrapper[4837]: E0313 11:50:27.048266 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:27 crc kubenswrapper[4837]: E0313 11:50:27.048362 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:27 crc kubenswrapper[4837]: E0313 11:50:27.048728 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.068436 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.069330 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.069381 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.069399 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.069422 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.069440 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:27Z","lastTransitionTime":"2026-03-13T11:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:27 crc kubenswrapper[4837]: E0313 11:50:27.094335 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.100698 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.100750 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.100763 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.100784 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.100797 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:27Z","lastTransitionTime":"2026-03-13T11:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:27 crc kubenswrapper[4837]: E0313 11:50:27.118091 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.122787 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.122852 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.122865 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.122882 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.122894 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:27Z","lastTransitionTime":"2026-03-13T11:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:27 crc kubenswrapper[4837]: E0313 11:50:27.167326 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.175823 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.176120 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.176228 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.176360 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.176448 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:27Z","lastTransitionTime":"2026-03-13T11:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:27 crc kubenswrapper[4837]: E0313 11:50:27.195931 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.199826 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.199861 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.199871 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.199885 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.199894 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:27Z","lastTransitionTime":"2026-03-13T11:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:27 crc kubenswrapper[4837]: E0313 11:50:27.216432 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:27 crc kubenswrapper[4837]: E0313 11:50:27.216590 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.048348 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.048425 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.048519 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.048566 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.049559 4837 scope.go:117] "RemoveContainer" containerID="7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7" Mar 13 11:50:29 crc kubenswrapper[4837]: E0313 11:50:29.049812 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:29 crc kubenswrapper[4837]: E0313 11:50:29.049904 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:29 crc kubenswrapper[4837]: E0313 11:50:29.049971 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:29 crc kubenswrapper[4837]: E0313 11:50:29.050104 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.738455 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/2.log" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.741148 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerStarted","Data":"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c"} Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.741582 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.754984 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565d
b798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.767447 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c
10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.782443 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b8a72f10c691a74098997e9d2383adf1aeb1811ad22dc8a74b5a47945d1e3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:22Z\\\",\\\"message\\\":\\\"2026-03-13T11:49:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124\\\\n2026-03-13T11:49:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124 to /host/opt/cni/bin/\\\\n2026-03-13T11:49:37Z [verbose] multus-daemon started\\\\n2026-03-13T11:49:37Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T11:50:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:50:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.805208 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:00Z\\\",\\\"message\\\":\\\":00.918999 7126 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:50:00.919044 
7126 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 11:50:00.919014 7126 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:50:00.919071 7126 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 11:50:00.919092 7126 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:50:00.919110 7126 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:50:00.919107 7126 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:50:00.919127 7126 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:50:00.919146 7126 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:50:00.919153 7126 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:50:00.919178 7126 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:50:00.919162 7126 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:50:00.919211 7126 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:50:00.919218 7126 factory.go:656] Stopping watch factory\\\\nI0313 11:50:00.919227 7126 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 11:50:00.919233 7126 ovnkube.go:599] Stopped 
ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:50:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:50:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.827868 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.840323 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb785bc-eb5f-41db-9d64-f1cecd2d25f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.856084 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.873782 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff70
46b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.890772 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3
305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.906218 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc 
kubenswrapper[4837]: I0313 11:50:29.921987 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.936434 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.953010 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.972536 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.986463 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.001342 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4513a0ad-4bd6-4aec-bce8-cb6337db1d57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ace7b4b1c79d13e8d3fd10baf836c890a60bfbdae807921ae0cc6365bc3dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f1e59b3f4d6931337d42b5716a5ab247f9314e2a0eb400f8fc438c0e1ff95bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f1e59b3f4d6931337d42b5716a5ab247f9314e2a0eb400f8fc438c0e1ff95bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc 
kubenswrapper[4837]: I0313 11:50:30.016695 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.032767 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: E0313 11:50:30.163598 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.747145 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/3.log" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.748111 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/2.log" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.752533 4837 generic.go:334] "Generic (PLEG): container finished" podID="43df29f7-1351-41f5-bfca-17f804837cb4" containerID="01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c" exitCode=1 Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.752617 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c"} Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.752793 4837 scope.go:117] "RemoveContainer" containerID="7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.753581 4837 scope.go:117] "RemoveContainer" containerID="01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c" Mar 13 11:50:30 crc kubenswrapper[4837]: E0313 11:50:30.753837 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.771791 4837 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03
105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.790150 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3
305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.811135 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.826417 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.843265 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc 
kubenswrapper[4837]: I0313 11:50:30.856146 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4513a0ad-4bd6-4aec-bce8-cb6337db1d57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ace7b4b1c79d13e8d3fd10baf836c890a60bfbdae807921ae0cc6365bc3dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://1f1e59b3f4d6931337d42b5716a5ab247f9314e2a0eb400f8fc438c0e1ff95bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f1e59b3f4d6931337d42b5716a5ab247f9314e2a0eb400f8fc438c0e1ff95bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.873034 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.889500 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.904860 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.926024 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.940252 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.966918 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:00Z\\\",\\\"message\\\":\\\":00.918999 7126 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:50:00.919044 7126 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 11:50:00.919014 7126 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:50:00.919071 7126 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 
11:50:00.919092 7126 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:50:00.919110 7126 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:50:00.919107 7126 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:50:00.919127 7126 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:50:00.919146 7126 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:50:00.919153 7126 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:50:00.919178 7126 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:50:00.919162 7126 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:50:00.919211 7126 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:50:00.919218 7126 factory.go:656] Stopping watch factory\\\\nI0313 11:50:00.919227 7126 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 11:50:00.919233 7126 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:50:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:29Z\\\",\\\"message\\\":\\\"ormers/factory.go:160\\\\nI0313 11:50:29.836038 7458 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:50:29.836680 7458 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:50:29.837264 7458 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:50:29.845714 7458 
handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:50:29.845841 7458 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:50:29.845894 7458 factory.go:656] Stopping watch factory\\\\nI0313 11:50:29.845925 7458 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:50:29.845982 7458 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:50:29.852851 7458 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0313 11:50:29.852888 7458 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0313 11:50:29.852950 7458 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:50:29.852977 7458 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 11:50:29.853057 7458 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:50:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"
mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.991708 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0
,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\
\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.009588 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb785bc-eb5f-41db-9d64-f1cecd2d25f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.027077 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.044822 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565d
b798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.047874 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.047901 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.047941 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.047981 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:31 crc kubenswrapper[4837]: E0313 11:50:31.048071 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:31 crc kubenswrapper[4837]: E0313 11:50:31.048170 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:31 crc kubenswrapper[4837]: E0313 11:50:31.048285 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:31 crc kubenswrapper[4837]: E0313 11:50:31.048385 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.062578 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42e
a83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.078156 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b8a72f10c691a74098997e9d2383adf1aeb1811ad22dc8a74b5a47945d1e3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://
9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:22Z\\\",\\\"message\\\":\\\"2026-03-13T11:49:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124\\\\n2026-03-13T11:49:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124 to /host/opt/cni/bin/\\\\n2026-03-13T11:49:37Z [verbose] multus-daemon started\\\\n2026-03-13T11:49:37Z [verbose] Readiness Indicator file check\\\\n2026-03-13T11:50:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:50:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/l
ib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.759090 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/3.log" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.768393 4837 scope.go:117] "RemoveContainer" containerID="01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c" Mar 13 11:50:31 crc kubenswrapper[4837]: E0313 11:50:31.769433 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting 
failed container=ovnkube-controller pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.783827 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.800672 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335
ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.813110 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc 
kubenswrapper[4837]: I0313 11:50:31.828391 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.842952 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.858564 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4513a0ad-4bd6-4aec-bce8-cb6337db1d57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ace7b4b1c79d13e8d3fd10baf836c890a60bfbdae807921ae0cc6365bc3dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f1e59b3f4d6931337d42b5716a5ab247f9314e2a0eb400f8fc438c0e1ff95bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f1e59b3f4d6931337d42b5716a5ab247f9314e2a0eb400f8fc438c0e1ff95bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc 
kubenswrapper[4837]: I0313 11:50:31.874584 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.890935 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.907377 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.921178 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c
10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.936029 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b8a72f10c691a74098997e9d2383adf1aeb1811ad22dc8a74b5a47945d1e3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:22Z\\\",\\\"message\\\":\\\"2026-03-13T11:49:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124\\\\n2026-03-13T11:49:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124 to /host/opt/cni/bin/\\\\n2026-03-13T11:49:37Z [verbose] multus-daemon started\\\\n2026-03-13T11:49:37Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T11:50:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:50:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.959335 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:29Z\\\",\\\"message\\\":\\\"ormers/factory.go:160\\\\nI0313 11:50:29.836038 7458 reflector.go:311] Stopping reflector 
*v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:50:29.836680 7458 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:50:29.837264 7458 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:50:29.845714 7458 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:50:29.845841 7458 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:50:29.845894 7458 factory.go:656] Stopping watch factory\\\\nI0313 11:50:29.845925 7458 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:50:29.845982 7458 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:50:29.852851 7458 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0313 11:50:29.852888 7458 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0313 11:50:29.852950 7458 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:50:29.852977 7458 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 11:50:29.853057 7458 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:50:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd98
4be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.981027 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.998034 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb785bc-eb5f-41db-9d64-f1cecd2d25f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:32 crc kubenswrapper[4837]: I0313 11:50:32.015364 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:32Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:32 crc kubenswrapper[4837]: I0313 11:50:32.029731 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565d
b798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:32Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:32 crc kubenswrapper[4837]: I0313 11:50:32.046472 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff70
46b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:32Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:32 crc kubenswrapper[4837]: I0313 11:50:32.066496 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3
305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:32Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:33 crc kubenswrapper[4837]: I0313 11:50:33.047511 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:33 crc kubenswrapper[4837]: I0313 11:50:33.047590 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:33 crc kubenswrapper[4837]: E0313 11:50:33.047665 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:33 crc kubenswrapper[4837]: I0313 11:50:33.047700 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:33 crc kubenswrapper[4837]: E0313 11:50:33.047724 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:33 crc kubenswrapper[4837]: E0313 11:50:33.047837 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:33 crc kubenswrapper[4837]: I0313 11:50:33.047938 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:33 crc kubenswrapper[4837]: E0313 11:50:33.048001 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.047592 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.047596 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:35 crc kubenswrapper[4837]: E0313 11:50:35.048021 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.047706 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:35 crc kubenswrapper[4837]: E0313 11:50:35.048075 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.047629 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:35 crc kubenswrapper[4837]: E0313 11:50:35.048262 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:35 crc kubenswrapper[4837]: E0313 11:50:35.048306 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.061276 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.071850 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.081799 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc 
kubenswrapper[4837]: I0313 11:50:35.093047 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.104935 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.116815 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4513a0ad-4bd6-4aec-bce8-cb6337db1d57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ace7b4b1c79d13e8d3fd10baf836c890a60bfbdae807921ae0cc6365bc3dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f1e59b3f4d6931337d42b5716a5ab247f9314e2a0eb400f8fc438c0e1ff95bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f1e59b3f4d6931337d42b5716a5ab247f9314e2a0eb400f8fc438c0e1ff95bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc 
kubenswrapper[4837]: I0313 11:50:35.132492 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.143844 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.159952 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: E0313 11:50:35.164299 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.173544 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.187163 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b8a72f10c691a74098997e9d2383adf1aeb1811ad22dc8a74b5a47945d1e3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:22Z\\\",\\\"message\\\":\\\"2026-03-13T11:49:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124\\\\n2026-03-13T11:49:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124 to /host/opt/cni/bin/\\\\n2026-03-13T11:49:37Z [verbose] multus-daemon started\\\\n2026-03-13T11:49:37Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T11:50:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:50:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.206884 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:29Z\\\",\\\"message\\\":\\\"ormers/factory.go:160\\\\nI0313 11:50:29.836038 7458 reflector.go:311] Stopping reflector 
*v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:50:29.836680 7458 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:50:29.837264 7458 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:50:29.845714 7458 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:50:29.845841 7458 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:50:29.845894 7458 factory.go:656] Stopping watch factory\\\\nI0313 11:50:29.845925 7458 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:50:29.845982 7458 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:50:29.852851 7458 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0313 11:50:29.852888 7458 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0313 11:50:29.852950 7458 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:50:29.852977 7458 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 11:50:29.853057 7458 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:50:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd98
4be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.231596 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.247151 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb785bc-eb5f-41db-9d64-f1cecd2d25f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.263397 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.276894 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565d
b798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.289956 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff70
46b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.304901 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3
305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.047427 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.047431 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:37 crc kubenswrapper[4837]: E0313 11:50:37.047660 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:37 crc kubenswrapper[4837]: E0313 11:50:37.047753 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.047480 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:37 crc kubenswrapper[4837]: E0313 11:50:37.047860 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.047955 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:37 crc kubenswrapper[4837]: E0313 11:50:37.048021 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.509889 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.509932 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.509941 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.509957 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.509968 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:37Z","lastTransitionTime":"2026-03-13T11:50:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:37 crc kubenswrapper[4837]: E0313 11:50:37.521533 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.524974 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.525009 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.525018 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.525035 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.525044 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:37Z","lastTransitionTime":"2026-03-13T11:50:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:37 crc kubenswrapper[4837]: E0313 11:50:37.544396 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.548241 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.548289 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.548298 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.548312 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.548321 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:37Z","lastTransitionTime":"2026-03-13T11:50:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:37 crc kubenswrapper[4837]: E0313 11:50:37.561439 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.565220 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.565252 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.565262 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.565276 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.565285 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:37Z","lastTransitionTime":"2026-03-13T11:50:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:37 crc kubenswrapper[4837]: E0313 11:50:37.577365 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.581038 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.581060 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.581069 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.581081 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.581090 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:37Z","lastTransitionTime":"2026-03-13T11:50:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:37 crc kubenswrapper[4837]: E0313 11:50:37.593264 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:37 crc kubenswrapper[4837]: E0313 11:50:37.593599 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:50:39 crc kubenswrapper[4837]: I0313 11:50:39.038197 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:39 crc kubenswrapper[4837]: I0313 11:50:39.038277 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:39 crc kubenswrapper[4837]: I0313 11:50:39.038300 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.038328 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:43.038303687 +0000 UTC m=+218.676570470 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:39 crc kubenswrapper[4837]: I0313 11:50:39.038387 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.038413 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.038430 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:50:39 crc kubenswrapper[4837]: I0313 11:50:39.038428 4837 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.038440 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.038472 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.038509 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.038527 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.038466 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.038545 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.038531 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 11:51:43.038516363 +0000 UTC m=+218.676783136 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.038658 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:51:43.038620646 +0000 UTC m=+218.676887429 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.038690 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-13 11:51:43.038676448 +0000 UTC m=+218.676943221 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.038715 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 11:51:43.038704779 +0000 UTC m=+218.676971552 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:50:39 crc kubenswrapper[4837]: I0313 11:50:39.047317 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:39 crc kubenswrapper[4837]: I0313 11:50:39.047347 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:39 crc kubenswrapper[4837]: I0313 11:50:39.047368 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:39 crc kubenswrapper[4837]: I0313 11:50:39.047390 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.047460 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.047574 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.047711 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.047773 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:39 crc kubenswrapper[4837]: I0313 11:50:39.138914 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.139092 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.139214 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs podName:86e5afeb-4720-4593-a53e-dfb5381d0b1d nodeName:}" failed. No retries permitted until 2026-03-13 11:51:43.139193559 +0000 UTC m=+218.777460532 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs") pod "network-metrics-daemon-cjn4q" (UID: "86e5afeb-4720-4593-a53e-dfb5381d0b1d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:50:40 crc kubenswrapper[4837]: E0313 11:50:40.165864 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:50:41 crc kubenswrapper[4837]: I0313 11:50:41.047817 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:41 crc kubenswrapper[4837]: I0313 11:50:41.047974 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:41 crc kubenswrapper[4837]: E0313 11:50:41.047976 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:41 crc kubenswrapper[4837]: I0313 11:50:41.048039 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:41 crc kubenswrapper[4837]: I0313 11:50:41.048058 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:41 crc kubenswrapper[4837]: E0313 11:50:41.048198 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:41 crc kubenswrapper[4837]: E0313 11:50:41.048273 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:41 crc kubenswrapper[4837]: E0313 11:50:41.048333 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:43 crc kubenswrapper[4837]: I0313 11:50:43.047726 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:43 crc kubenswrapper[4837]: I0313 11:50:43.047759 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:43 crc kubenswrapper[4837]: I0313 11:50:43.047737 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:43 crc kubenswrapper[4837]: E0313 11:50:43.047863 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:43 crc kubenswrapper[4837]: I0313 11:50:43.047864 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:43 crc kubenswrapper[4837]: E0313 11:50:43.047940 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:43 crc kubenswrapper[4837]: E0313 11:50:43.048053 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:43 crc kubenswrapper[4837]: E0313 11:50:43.048254 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:43 crc kubenswrapper[4837]: I0313 11:50:43.048933 4837 scope.go:117] "RemoveContainer" containerID="01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c" Mar 13 11:50:43 crc kubenswrapper[4837]: E0313 11:50:43.049075 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.048012 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.048979 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.049039 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.049119 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:45 crc kubenswrapper[4837]: E0313 11:50:45.049348 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:45 crc kubenswrapper[4837]: E0313 11:50:45.049468 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:45 crc kubenswrapper[4837]: E0313 11:50:45.049511 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:45 crc kubenswrapper[4837]: E0313 11:50:45.048968 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.061841 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.089044 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=18.089021143 podStartE2EDuration="18.089021143s" podCreationTimestamp="2026-03-13 11:50:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:45.072384479 +0000 UTC m=+160.710651242" watchObservedRunningTime="2026-03-13 11:50:45.089021143 +0000 UTC m=+160.727287936" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.149929 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-np68d" podStartSLOduration=104.149887413 podStartE2EDuration="1m44.149887413s" podCreationTimestamp="2026-03-13 11:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:45.149851163 +0000 UTC m=+160.788117926" watchObservedRunningTime="2026-03-13 11:50:45.149887413 +0000 UTC m=+160.788154196" Mar 13 11:50:45 crc kubenswrapper[4837]: E0313 11:50:45.166693 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.198931 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=57.19890946 podStartE2EDuration="57.19890946s" podCreationTimestamp="2026-03-13 11:49:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:45.181293144 +0000 UTC m=+160.819559907" watchObservedRunningTime="2026-03-13 11:50:45.19890946 +0000 UTC m=+160.837176223" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.211288 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=48.21126448 podStartE2EDuration="48.21126448s" podCreationTimestamp="2026-03-13 11:49:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:45.199186678 +0000 UTC m=+160.837453441" watchObservedRunningTime="2026-03-13 11:50:45.21126448 +0000 UTC m=+160.849531243" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.225658 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podStartSLOduration=103.225603501 podStartE2EDuration="1m43.225603501s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:45.225531039 +0000 UTC m=+160.863797822" watchObservedRunningTime="2026-03-13 11:50:45.225603501 +0000 UTC m=+160.863870284" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.250787 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" podStartSLOduration=103.250768845 
podStartE2EDuration="1m43.250768845s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:45.23824136 +0000 UTC m=+160.876508133" watchObservedRunningTime="2026-03-13 11:50:45.250768845 +0000 UTC m=+160.889035608" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.251062 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-qg957" podStartSLOduration=103.251058374 podStartE2EDuration="1m43.251058374s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:45.249994901 +0000 UTC m=+160.888261674" watchObservedRunningTime="2026-03-13 11:50:45.251058374 +0000 UTC m=+160.889325137" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.309540 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" podStartSLOduration=103.309521028 podStartE2EDuration="1m43.309521028s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:45.309267031 +0000 UTC m=+160.947533804" watchObservedRunningTime="2026-03-13 11:50:45.309521028 +0000 UTC m=+160.947787801" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.309774 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=63.309767576 podStartE2EDuration="1m3.309767576s" podCreationTimestamp="2026-03-13 11:49:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:45.286522943 +0000 UTC m=+160.924789706" 
watchObservedRunningTime="2026-03-13 11:50:45.309767576 +0000 UTC m=+160.948034339" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.338371 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-xwmn9" podStartSLOduration=104.338350478 podStartE2EDuration="1m44.338350478s" podCreationTimestamp="2026-03-13 11:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:45.329461737 +0000 UTC m=+160.967728500" watchObservedRunningTime="2026-03-13 11:50:45.338350478 +0000 UTC m=+160.976617241" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.047876 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.047884 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:47 crc kubenswrapper[4837]: E0313 11:50:47.048090 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.047914 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.047914 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:47 crc kubenswrapper[4837]: E0313 11:50:47.048206 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:47 crc kubenswrapper[4837]: E0313 11:50:47.048265 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:47 crc kubenswrapper[4837]: E0313 11:50:47.048384 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.723103 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.723173 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.723200 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.723228 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.723250 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:47Z","lastTransitionTime":"2026-03-13T11:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.797659 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm"] Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.798092 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.799918 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.799973 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.800002 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.800578 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.927433 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/86283508-cb82-4fca-b672-4c5cd27b8018-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.927474 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86283508-cb82-4fca-b672-4c5cd27b8018-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.927497 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/86283508-cb82-4fca-b672-4c5cd27b8018-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm"
Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.927523 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/86283508-cb82-4fca-b672-4c5cd27b8018-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm"
Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.927680 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86283508-cb82-4fca-b672-4c5cd27b8018-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm"
Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.029286 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86283508-cb82-4fca-b672-4c5cd27b8018-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm"
Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.029366 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/86283508-cb82-4fca-b672-4c5cd27b8018-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm"
Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.029431 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86283508-cb82-4fca-b672-4c5cd27b8018-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm"
Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.029456 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86283508-cb82-4fca-b672-4c5cd27b8018-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm"
Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.029498 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/86283508-cb82-4fca-b672-4c5cd27b8018-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm"
Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.029564 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/86283508-cb82-4fca-b672-4c5cd27b8018-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm"
Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.029564 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/86283508-cb82-4fca-b672-4c5cd27b8018-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm"
Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.030763 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86283508-cb82-4fca-b672-4c5cd27b8018-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm"
Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.036817 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86283508-cb82-4fca-b672-4c5cd27b8018-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm"
Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.049493 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86283508-cb82-4fca-b672-4c5cd27b8018-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm"
Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.111765 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm"
Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.139558 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.149148 4837 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.813562 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm" event={"ID":"86283508-cb82-4fca-b672-4c5cd27b8018","Type":"ContainerStarted","Data":"a436092eb4cf4025a437251be7e13e7be3ca141be88a723720a9c9b6a732bf29"}
Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.813608 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm" event={"ID":"86283508-cb82-4fca-b672-4c5cd27b8018","Type":"ContainerStarted","Data":"0c8bbf900ab9ff859b13a83e5127257a9dd11f980c2466929bccfbeac6ae5eef"}
Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.828620 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=3.828587033 podStartE2EDuration="3.828587033s" podCreationTimestamp="2026-03-13 11:50:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:47.828722777 +0000 UTC m=+163.466989540" watchObservedRunningTime="2026-03-13 11:50:48.828587033 +0000 UTC m=+164.466853836"
Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.829446 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm" podStartSLOduration=107.82943348 podStartE2EDuration="1m47.82943348s" podCreationTimestamp="2026-03-13 11:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:48.827878012 +0000 UTC m=+164.466144805" watchObservedRunningTime="2026-03-13 11:50:48.82943348 +0000 UTC m=+164.467700293"
Mar 13 11:50:49 crc kubenswrapper[4837]: I0313 11:50:49.047677 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q"
Mar 13 11:50:49 crc kubenswrapper[4837]: I0313 11:50:49.047677 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 11:50:49 crc kubenswrapper[4837]: E0313 11:50:49.047859 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d"
Mar 13 11:50:49 crc kubenswrapper[4837]: I0313 11:50:49.047697 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 11:50:49 crc kubenswrapper[4837]: I0313 11:50:49.047697 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 11:50:49 crc kubenswrapper[4837]: E0313 11:50:49.047965 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 11:50:49 crc kubenswrapper[4837]: E0313 11:50:49.048133 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 11:50:49 crc kubenswrapper[4837]: E0313 11:50:49.048246 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 11:50:50 crc kubenswrapper[4837]: E0313 11:50:50.167701 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 13 11:50:51 crc kubenswrapper[4837]: I0313 11:50:51.048162 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 11:50:51 crc kubenswrapper[4837]: I0313 11:50:51.048168 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q"
Mar 13 11:50:51 crc kubenswrapper[4837]: E0313 11:50:51.048367 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 11:50:51 crc kubenswrapper[4837]: I0313 11:50:51.048192 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 11:50:51 crc kubenswrapper[4837]: I0313 11:50:51.048169 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 11:50:51 crc kubenswrapper[4837]: E0313 11:50:51.048465 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d"
Mar 13 11:50:51 crc kubenswrapper[4837]: E0313 11:50:51.048547 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 11:50:51 crc kubenswrapper[4837]: E0313 11:50:51.048618 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 11:50:53 crc kubenswrapper[4837]: I0313 11:50:53.047832 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 11:50:53 crc kubenswrapper[4837]: I0313 11:50:53.047865 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 11:50:53 crc kubenswrapper[4837]: I0313 11:50:53.047878 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 11:50:53 crc kubenswrapper[4837]: E0313 11:50:53.047989 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 11:50:53 crc kubenswrapper[4837]: I0313 11:50:53.048050 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q"
Mar 13 11:50:53 crc kubenswrapper[4837]: E0313 11:50:53.048196 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 11:50:53 crc kubenswrapper[4837]: E0313 11:50:53.048314 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 11:50:53 crc kubenswrapper[4837]: E0313 11:50:53.048384 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d"
Mar 13 11:50:54 crc kubenswrapper[4837]: I0313 11:50:54.048849 4837 scope.go:117] "RemoveContainer" containerID="01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c"
Mar 13 11:50:54 crc kubenswrapper[4837]: E0313 11:50:54.049128 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4"
Mar 13 11:50:55 crc kubenswrapper[4837]: I0313 11:50:55.047544 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q"
Mar 13 11:50:55 crc kubenswrapper[4837]: I0313 11:50:55.047606 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 11:50:55 crc kubenswrapper[4837]: I0313 11:50:55.047701 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 11:50:55 crc kubenswrapper[4837]: I0313 11:50:55.048619 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 11:50:55 crc kubenswrapper[4837]: E0313 11:50:55.048613 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d"
Mar 13 11:50:55 crc kubenswrapper[4837]: E0313 11:50:55.048764 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 11:50:55 crc kubenswrapper[4837]: E0313 11:50:55.048848 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 11:50:55 crc kubenswrapper[4837]: E0313 11:50:55.049135 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 11:50:55 crc kubenswrapper[4837]: E0313 11:50:55.168652 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 13 11:50:57 crc kubenswrapper[4837]: I0313 11:50:57.048108 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 11:50:57 crc kubenswrapper[4837]: I0313 11:50:57.048168 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q"
Mar 13 11:50:57 crc kubenswrapper[4837]: I0313 11:50:57.048242 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 11:50:57 crc kubenswrapper[4837]: I0313 11:50:57.048116 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 11:50:57 crc kubenswrapper[4837]: E0313 11:50:57.048293 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 11:50:57 crc kubenswrapper[4837]: E0313 11:50:57.048467 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d"
Mar 13 11:50:57 crc kubenswrapper[4837]: E0313 11:50:57.048707 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 11:50:57 crc kubenswrapper[4837]: E0313 11:50:57.049292 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 11:50:59 crc kubenswrapper[4837]: I0313 11:50:59.047236 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q"
Mar 13 11:50:59 crc kubenswrapper[4837]: I0313 11:50:59.047293 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 11:50:59 crc kubenswrapper[4837]: E0313 11:50:59.047420 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d"
Mar 13 11:50:59 crc kubenswrapper[4837]: I0313 11:50:59.047486 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 11:50:59 crc kubenswrapper[4837]: I0313 11:50:59.047531 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 11:50:59 crc kubenswrapper[4837]: E0313 11:50:59.047618 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 11:50:59 crc kubenswrapper[4837]: E0313 11:50:59.047789 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 11:50:59 crc kubenswrapper[4837]: E0313 11:50:59.047841 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 11:51:00 crc kubenswrapper[4837]: E0313 11:51:00.169806 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 13 11:51:01 crc kubenswrapper[4837]: I0313 11:51:01.047611 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 11:51:01 crc kubenswrapper[4837]: I0313 11:51:01.047772 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q"
Mar 13 11:51:01 crc kubenswrapper[4837]: E0313 11:51:01.047869 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 11:51:01 crc kubenswrapper[4837]: I0313 11:51:01.047780 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 11:51:01 crc kubenswrapper[4837]: E0313 11:51:01.048008 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d"
Mar 13 11:51:01 crc kubenswrapper[4837]: E0313 11:51:01.048244 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 11:51:01 crc kubenswrapper[4837]: I0313 11:51:01.048592 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 11:51:01 crc kubenswrapper[4837]: E0313 11:51:01.048745 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 11:51:03 crc kubenswrapper[4837]: I0313 11:51:03.048082 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 11:51:03 crc kubenswrapper[4837]: I0313 11:51:03.048144 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q"
Mar 13 11:51:03 crc kubenswrapper[4837]: I0313 11:51:03.048180 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 11:51:03 crc kubenswrapper[4837]: I0313 11:51:03.048117 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 11:51:03 crc kubenswrapper[4837]: E0313 11:51:03.048236 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 11:51:03 crc kubenswrapper[4837]: E0313 11:51:03.048325 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 11:51:03 crc kubenswrapper[4837]: E0313 11:51:03.048413 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d"
Mar 13 11:51:03 crc kubenswrapper[4837]: E0313 11:51:03.048536 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 11:51:05 crc kubenswrapper[4837]: I0313 11:51:05.048162 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 11:51:05 crc kubenswrapper[4837]: I0313 11:51:05.048192 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 11:51:05 crc kubenswrapper[4837]: I0313 11:51:05.048232 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 11:51:05 crc kubenswrapper[4837]: I0313 11:51:05.048416 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q"
Mar 13 11:51:05 crc kubenswrapper[4837]: E0313 11:51:05.049674 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 11:51:05 crc kubenswrapper[4837]: E0313 11:51:05.049760 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 11:51:05 crc kubenswrapper[4837]: E0313 11:51:05.049906 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d"
Mar 13 11:51:05 crc kubenswrapper[4837]: E0313 11:51:05.050024 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 11:51:05 crc kubenswrapper[4837]: E0313 11:51:05.170323 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 13 11:51:07 crc kubenswrapper[4837]: I0313 11:51:07.047747 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 11:51:07 crc kubenswrapper[4837]: I0313 11:51:07.047815 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 11:51:07 crc kubenswrapper[4837]: I0313 11:51:07.047876 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 11:51:07 crc kubenswrapper[4837]: E0313 11:51:07.047977 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 11:51:07 crc kubenswrapper[4837]: I0313 11:51:07.048019 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q"
Mar 13 11:51:07 crc kubenswrapper[4837]: E0313 11:51:07.048214 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 11:51:07 crc kubenswrapper[4837]: E0313 11:51:07.048528 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d"
Mar 13 11:51:07 crc kubenswrapper[4837]: E0313 11:51:07.048692 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 11:51:08 crc kubenswrapper[4837]: I0313 11:51:08.048486 4837 scope.go:117] "RemoveContainer" containerID="01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c"
Mar 13 11:51:08 crc kubenswrapper[4837]: E0313 11:51:08.048688 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4"
Mar 13 11:51:09 crc kubenswrapper[4837]: I0313 11:51:09.047904 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 11:51:09 crc kubenswrapper[4837]: I0313 11:51:09.047938 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 11:51:09 crc kubenswrapper[4837]: I0313 11:51:09.047981 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 11:51:09 crc kubenswrapper[4837]: I0313 11:51:09.048078 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q"
Mar 13 11:51:09 crc kubenswrapper[4837]: E0313 11:51:09.048081 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 11:51:09 crc kubenswrapper[4837]: E0313 11:51:09.048199 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d"
Mar 13 11:51:09 crc kubenswrapper[4837]: E0313 11:51:09.048301 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 11:51:09 crc kubenswrapper[4837]: E0313 11:51:09.048413 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:51:09 crc kubenswrapper[4837]: I0313 11:51:09.888841 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qg957_cbb3f4c6-a6c5-4059-8beb-04179d70aff5/kube-multus/1.log" Mar 13 11:51:09 crc kubenswrapper[4837]: I0313 11:51:09.889212 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qg957_cbb3f4c6-a6c5-4059-8beb-04179d70aff5/kube-multus/0.log" Mar 13 11:51:09 crc kubenswrapper[4837]: I0313 11:51:09.889248 4837 generic.go:334] "Generic (PLEG): container finished" podID="cbb3f4c6-a6c5-4059-8beb-04179d70aff5" containerID="19b8a72f10c691a74098997e9d2383adf1aeb1811ad22dc8a74b5a47945d1e3e" exitCode=1 Mar 13 11:51:09 crc kubenswrapper[4837]: I0313 11:51:09.889280 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qg957" event={"ID":"cbb3f4c6-a6c5-4059-8beb-04179d70aff5","Type":"ContainerDied","Data":"19b8a72f10c691a74098997e9d2383adf1aeb1811ad22dc8a74b5a47945d1e3e"} Mar 13 11:51:09 crc kubenswrapper[4837]: I0313 11:51:09.889311 4837 scope.go:117] "RemoveContainer" containerID="9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27" Mar 13 11:51:09 crc kubenswrapper[4837]: I0313 11:51:09.889693 4837 scope.go:117] "RemoveContainer" containerID="19b8a72f10c691a74098997e9d2383adf1aeb1811ad22dc8a74b5a47945d1e3e" Mar 13 11:51:09 crc kubenswrapper[4837]: E0313 11:51:09.889851 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-qg957_openshift-multus(cbb3f4c6-a6c5-4059-8beb-04179d70aff5)\"" pod="openshift-multus/multus-qg957" podUID="cbb3f4c6-a6c5-4059-8beb-04179d70aff5" Mar 13 11:51:10 crc kubenswrapper[4837]: E0313 11:51:10.171514 4837 kubelet.go:2916] "Container runtime network not ready" 
networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:51:10 crc kubenswrapper[4837]: I0313 11:51:10.892946 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qg957_cbb3f4c6-a6c5-4059-8beb-04179d70aff5/kube-multus/1.log" Mar 13 11:51:11 crc kubenswrapper[4837]: I0313 11:51:11.048199 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:11 crc kubenswrapper[4837]: I0313 11:51:11.048200 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:11 crc kubenswrapper[4837]: E0313 11:51:11.048337 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:51:11 crc kubenswrapper[4837]: I0313 11:51:11.048386 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:11 crc kubenswrapper[4837]: I0313 11:51:11.048429 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:11 crc kubenswrapper[4837]: E0313 11:51:11.048617 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:51:11 crc kubenswrapper[4837]: E0313 11:51:11.048789 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:51:11 crc kubenswrapper[4837]: E0313 11:51:11.048891 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:51:13 crc kubenswrapper[4837]: I0313 11:51:13.047141 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:13 crc kubenswrapper[4837]: I0313 11:51:13.047171 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:13 crc kubenswrapper[4837]: I0313 11:51:13.047193 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:13 crc kubenswrapper[4837]: E0313 11:51:13.047262 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:51:13 crc kubenswrapper[4837]: I0313 11:51:13.047340 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:13 crc kubenswrapper[4837]: E0313 11:51:13.047402 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:51:13 crc kubenswrapper[4837]: E0313 11:51:13.047493 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:51:13 crc kubenswrapper[4837]: E0313 11:51:13.047560 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:51:15 crc kubenswrapper[4837]: I0313 11:51:15.047391 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:15 crc kubenswrapper[4837]: E0313 11:51:15.049134 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:51:15 crc kubenswrapper[4837]: I0313 11:51:15.049277 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:15 crc kubenswrapper[4837]: I0313 11:51:15.049385 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:15 crc kubenswrapper[4837]: I0313 11:51:15.049434 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:15 crc kubenswrapper[4837]: E0313 11:51:15.049514 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:51:15 crc kubenswrapper[4837]: E0313 11:51:15.049805 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:51:15 crc kubenswrapper[4837]: E0313 11:51:15.050299 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:51:15 crc kubenswrapper[4837]: E0313 11:51:15.172482 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:51:17 crc kubenswrapper[4837]: I0313 11:51:17.047682 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:17 crc kubenswrapper[4837]: I0313 11:51:17.047715 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:17 crc kubenswrapper[4837]: I0313 11:51:17.047682 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:17 crc kubenswrapper[4837]: I0313 11:51:17.047710 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:17 crc kubenswrapper[4837]: E0313 11:51:17.047815 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:51:17 crc kubenswrapper[4837]: E0313 11:51:17.047923 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:51:17 crc kubenswrapper[4837]: E0313 11:51:17.047970 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:51:17 crc kubenswrapper[4837]: E0313 11:51:17.048129 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:51:19 crc kubenswrapper[4837]: I0313 11:51:19.047944 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:19 crc kubenswrapper[4837]: I0313 11:51:19.048010 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:19 crc kubenswrapper[4837]: I0313 11:51:19.047964 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:19 crc kubenswrapper[4837]: I0313 11:51:19.048390 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:19 crc kubenswrapper[4837]: E0313 11:51:19.048502 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:51:19 crc kubenswrapper[4837]: E0313 11:51:19.048851 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:51:19 crc kubenswrapper[4837]: E0313 11:51:19.048962 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:51:19 crc kubenswrapper[4837]: E0313 11:51:19.049015 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:51:19 crc kubenswrapper[4837]: I0313 11:51:19.049032 4837 scope.go:117] "RemoveContainer" containerID="01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c" Mar 13 11:51:19 crc kubenswrapper[4837]: I0313 11:51:19.920742 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/3.log" Mar 13 11:51:19 crc kubenswrapper[4837]: I0313 11:51:19.923404 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerStarted","Data":"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282"} Mar 13 11:51:19 crc kubenswrapper[4837]: I0313 11:51:19.923796 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:51:19 crc kubenswrapper[4837]: I0313 11:51:19.954914 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cjn4q"] Mar 13 11:51:19 crc kubenswrapper[4837]: I0313 11:51:19.955042 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:19 crc kubenswrapper[4837]: E0313 11:51:19.955132 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:51:19 crc kubenswrapper[4837]: I0313 11:51:19.965114 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podStartSLOduration=137.965097201 podStartE2EDuration="2m17.965097201s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:19.964258284 +0000 UTC m=+195.602525047" watchObservedRunningTime="2026-03-13 11:51:19.965097201 +0000 UTC m=+195.603363964" Mar 13 11:51:20 crc kubenswrapper[4837]: E0313 11:51:20.173817 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:51:21 crc kubenswrapper[4837]: I0313 11:51:21.047151 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:21 crc kubenswrapper[4837]: E0313 11:51:21.047549 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:51:21 crc kubenswrapper[4837]: I0313 11:51:21.047364 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:21 crc kubenswrapper[4837]: E0313 11:51:21.047698 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:51:21 crc kubenswrapper[4837]: I0313 11:51:21.047166 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:21 crc kubenswrapper[4837]: I0313 11:51:21.047582 4837 scope.go:117] "RemoveContainer" containerID="19b8a72f10c691a74098997e9d2383adf1aeb1811ad22dc8a74b5a47945d1e3e" Mar 13 11:51:21 crc kubenswrapper[4837]: E0313 11:51:21.047770 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:51:21 crc kubenswrapper[4837]: I0313 11:51:21.932192 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qg957_cbb3f4c6-a6c5-4059-8beb-04179d70aff5/kube-multus/1.log" Mar 13 11:51:21 crc kubenswrapper[4837]: I0313 11:51:21.932859 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qg957" event={"ID":"cbb3f4c6-a6c5-4059-8beb-04179d70aff5","Type":"ContainerStarted","Data":"1effae1c86d3c4f5369295262f269b1dad692c561321e1c868d2b4fe7f736d7c"} Mar 13 11:51:22 crc kubenswrapper[4837]: I0313 11:51:22.047158 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:22 crc kubenswrapper[4837]: E0313 11:51:22.047288 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:51:23 crc kubenswrapper[4837]: I0313 11:51:23.047867 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:23 crc kubenswrapper[4837]: I0313 11:51:23.047903 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:23 crc kubenswrapper[4837]: I0313 11:51:23.047960 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:23 crc kubenswrapper[4837]: E0313 11:51:23.047992 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:51:23 crc kubenswrapper[4837]: E0313 11:51:23.048123 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:51:23 crc kubenswrapper[4837]: E0313 11:51:23.048179 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:51:24 crc kubenswrapper[4837]: I0313 11:51:24.047880 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:24 crc kubenswrapper[4837]: E0313 11:51:24.048119 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:51:25 crc kubenswrapper[4837]: I0313 11:51:25.048096 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:25 crc kubenswrapper[4837]: E0313 11:51:25.049529 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:51:25 crc kubenswrapper[4837]: I0313 11:51:25.049577 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:25 crc kubenswrapper[4837]: I0313 11:51:25.049587 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:25 crc kubenswrapper[4837]: E0313 11:51:25.049752 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 11:51:25 crc kubenswrapper[4837]: E0313 11:51:25.049875 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 11:51:26 crc kubenswrapper[4837]: I0313 11:51:26.047575 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q"
Mar 13 11:51:26 crc kubenswrapper[4837]: I0313 11:51:26.050122 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 13 11:51:26 crc kubenswrapper[4837]: I0313 11:51:26.051893 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 13 11:51:27 crc kubenswrapper[4837]: I0313 11:51:27.048176 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 11:51:27 crc kubenswrapper[4837]: I0313 11:51:27.048238 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 11:51:27 crc kubenswrapper[4837]: I0313 11:51:27.048176 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 11:51:27 crc kubenswrapper[4837]: I0313 11:51:27.050511 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 13 11:51:27 crc kubenswrapper[4837]: I0313 11:51:27.051931 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 13 11:51:27 crc kubenswrapper[4837]: I0313 11:51:27.051995 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 13 11:51:27 crc kubenswrapper[4837]: I0313 11:51:27.052110 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.882208 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.935004 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qqkbm"]
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.936257 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-qqkbm"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.940523 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.940559 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.940547 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.940730 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.940876 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.941135 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.949712 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.954431 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vsp2m"]
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.954994 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.955428 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.955428 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb"]
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.956947 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.958157 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9dbhc"]
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.958933 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.961867 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww"]
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.962523 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.963541 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs"]
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.964188 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.964874 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr"]
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.965522 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.970807 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8q6j6"]
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.971743 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.976436 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.989859 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.990261 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.990671 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.991158 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.991406 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.991652 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.991660 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.992009 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.992276 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.992336 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.992527 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.992617 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-f97pg"]
Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.993957 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.001972 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.002304 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.002438 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.004288 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.005735 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-84ccm"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.006024 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-8dj7w"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.006433 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-8dj7w"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.006656 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-q2qpt"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.007236 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.007400 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.008167 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.008466 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.008688 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.008727 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.008998 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.008773 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.009346 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.009445 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.009453 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.009615 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.009889 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.009743 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.010243 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.008779 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.010484 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.010710 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.008892 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.010996 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.010547 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.009263 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.011546 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-q2qpt"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.011562 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.009258 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.011683 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.012174 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.015735 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.015976 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.016567 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-8ktsx"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.016886 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.017034 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.017259 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.017627 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-8ktsx"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.018016 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.018183 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.018484 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.018659 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.018844 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.018867 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.018969 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.018996 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.019630 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.019937 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.020573 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.021038 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.021620 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.021659 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2w96t"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.022112 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.023300 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.026223 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.026808 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.027353 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.030001 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.030613 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.031147 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-64xpb"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.031692 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-l4rxn"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.031730 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-64xpb"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.032049 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.032111 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.032610 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.032926 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.033326 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.035925 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8zzqp"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.036734 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-8zzqp"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.037002 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-9tkxg"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.038075 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-9tkxg"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.039629 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9dbhc"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.041780 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.042412 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wcfj4"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.042957 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-wcfj4"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.043221 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.066372 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.071262 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.075220 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.096344 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.096617 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.103993 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.104858 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.107238 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.107981 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.108342 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.108510 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.108977 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27d45de2-e0ab-4c3e-b3da-b20e60e26801-audit-dir\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109015 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ffb5553f-d2d5-4584-9bf8-7212a378f358-encryption-config\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109041 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a3cabe4-69ee-49f7-a783-e72ac1a56821-config\") pod \"route-controller-manager-6576b87f9c-qs2qs\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109067 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109098 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109125 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/10ac507b-7307-4e09-ab72-b956d0139396-auth-proxy-config\") pod \"machine-approver-56656f9798-dhrww\" (UID: \"10ac507b-7307-4e09-ab72-b956d0139396\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109148 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-config\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109170 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-service-ca-bundle\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109194 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74s9f\" (UniqueName: \"kubernetes.io/projected/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-kube-api-access-74s9f\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109218 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109247 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0-available-featuregates\") pod \"openshift-config-operator-7777fb866f-f97pg\" (UID: \"f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109270 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109290 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ffb5553f-d2d5-4584-9bf8-7212a378f358-audit-policies\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109312 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsdcp\" (UniqueName: \"kubernetes.io/projected/ffb5553f-d2d5-4584-9bf8-7212a378f358-kube-api-access-zsdcp\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109336 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9blhw\" (UniqueName: \"kubernetes.io/projected/003e8201-4e67-4356-b0c1-8cc135451069-kube-api-access-9blhw\") pod \"console-operator-58897d9998-8dj7w\" (UID: \"003e8201-4e67-4356-b0c1-8cc135451069\") " pod="openshift-console-operator/console-operator-58897d9998-8dj7w"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109358 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6dmk\" (UniqueName: \"kubernetes.io/projected/6db10103-96be-4420-b302-a7064e347f61-kube-api-access-q6dmk\") pod \"machine-api-operator-5694c8668f-vsp2m\" (UID: \"6db10103-96be-4420-b302-a7064e347f61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109380 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ad2861b-4f40-4551-8aff-304359734792-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dv9wr\" (UID: \"6ad2861b-4f40-4551-8aff-304359734792\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109402 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109424 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109443 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-oauth-serving-cert\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109465 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/10ac507b-7307-4e09-ab72-b956d0139396-machine-approver-tls\") pod \"machine-approver-56656f9798-dhrww\" (UID: \"10ac507b-7307-4e09-ab72-b956d0139396\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109487 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffb5553f-d2d5-4584-9bf8-7212a378f358-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109508 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mclx9\" (UniqueName: \"kubernetes.io/projected/6ad2861b-4f40-4551-8aff-304359734792-kube-api-access-mclx9\") pod \"openshift-apiserver-operator-796bbdcf4f-dv9wr\" (UID: \"6ad2861b-4f40-4551-8aff-304359734792\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109530 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109567 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6db10103-96be-4420-b302-a7064e347f61-config\") pod \"machine-api-operator-5694c8668f-vsp2m\" (UID: \"6db10103-96be-4420-b302-a7064e347f61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109588 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvs46\" (UniqueName: \"kubernetes.io/projected/f8bc408a-bca6-42ff-8572-2ba9a3978682-kube-api-access-xvs46\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109621 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3e1f747d-78f3-4cbc-b313-eed531936c02-encryption-config\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109673 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109675 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vffzw\" (UniqueName: \"kubernetes.io/projected/3e1f747d-78f3-4cbc-b313-eed531936c02-kube-api-access-vffzw\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109742 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8bc408a-bca6-42ff-8572-2ba9a3978682-serving-cert\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109892 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName:
\"kubernetes.io/secret/ffb5553f-d2d5-4584-9bf8-7212a378f358-etcd-client\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109915 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109922 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks5xn\" (UniqueName: \"kubernetes.io/projected/f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0-kube-api-access-ks5xn\") pod \"openshift-config-operator-7777fb866f-f97pg\" (UID: \"f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110015 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-audit\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110053 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-image-import-ca\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110086 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj2pk\" (UniqueName: 
\"kubernetes.io/projected/5a3cabe4-69ee-49f7-a783-e72ac1a56821-kube-api-access-sj2pk\") pod \"route-controller-manager-6576b87f9c-qs2qs\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110114 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3e1f747d-78f3-4cbc-b313-eed531936c02-node-pullsecrets\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110135 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-config\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110159 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/003e8201-4e67-4356-b0c1-8cc135451069-config\") pod \"console-operator-58897d9998-8dj7w\" (UID: \"003e8201-4e67-4356-b0c1-8cc135451069\") " pod="openshift-console-operator/console-operator-58897d9998-8dj7w" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110195 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110226 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a3cabe4-69ee-49f7-a783-e72ac1a56821-client-ca\") pod \"route-controller-manager-6576b87f9c-qs2qs\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110255 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6db10103-96be-4420-b302-a7064e347f61-images\") pod \"machine-api-operator-5694c8668f-vsp2m\" (UID: \"6db10103-96be-4420-b302-a7064e347f61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110280 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-client-ca\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110304 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110325 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0-serving-cert\") pod \"openshift-config-operator-7777fb866f-f97pg\" (UID: \"f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110367 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad2861b-4f40-4551-8aff-304359734792-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dv9wr\" (UID: \"6ad2861b-4f40-4551-8aff-304359734792\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110397 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-service-ca\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110444 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ffb5553f-d2d5-4584-9bf8-7212a378f358-audit-dir\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110631 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110672 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110502 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110813 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110855 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ffb5553f-d2d5-4584-9bf8-7212a378f358-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110880 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110909 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/003e8201-4e67-4356-b0c1-8cc135451069-trusted-ca\") pod 
\"console-operator-58897d9998-8dj7w\" (UID: \"003e8201-4e67-4356-b0c1-8cc135451069\") " pod="openshift-console-operator/console-operator-58897d9998-8dj7w" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110934 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e1f747d-78f3-4cbc-b313-eed531936c02-audit-dir\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110957 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3e1f747d-78f3-4cbc-b313-eed531936c02-etcd-client\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110980 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-etcd-serving-ca\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111000 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-config\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111021 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-serving-cert\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111044 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-serving-cert\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111065 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e1f747d-78f3-4cbc-b313-eed531936c02-serving-cert\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111086 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnwqw\" (UniqueName: \"kubernetes.io/projected/10ac507b-7307-4e09-ab72-b956d0139396-kube-api-access-cnwqw\") pod \"machine-approver-56656f9798-dhrww\" (UID: \"10ac507b-7307-4e09-ab72-b956d0139396\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111127 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-audit-policies\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc 
kubenswrapper[4837]: I0313 11:51:29.111147 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fca26784-7fdf-4923-bd07-35d182c2ad14-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jpbgx\" (UID: \"fca26784-7fdf-4923-bd07-35d182c2ad14\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111167 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a3cabe4-69ee-49f7-a783-e72ac1a56821-serving-cert\") pod \"route-controller-manager-6576b87f9c-qs2qs\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111191 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffb5553f-d2d5-4584-9bf8-7212a378f358-serving-cert\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111210 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/003e8201-4e67-4356-b0c1-8cc135451069-serving-cert\") pod \"console-operator-58897d9998-8dj7w\" (UID: \"003e8201-4e67-4356-b0c1-8cc135451069\") " pod="openshift-console-operator/console-operator-58897d9998-8dj7w" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111233 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-oauth-config\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111256 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdq6z\" (UniqueName: \"kubernetes.io/projected/fca26784-7fdf-4923-bd07-35d182c2ad14-kube-api-access-hdq6z\") pod \"cluster-samples-operator-665b6dd947-jpbgx\" (UID: \"fca26784-7fdf-4923-bd07-35d182c2ad14\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111280 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111301 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6db10103-96be-4420-b302-a7064e347f61-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vsp2m\" (UID: \"6db10103-96be-4420-b302-a7064e347f61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111323 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpjrd\" (UniqueName: \"kubernetes.io/projected/c83842ec-9933-4f84-bb4a-c84ca61a28e1-kube-api-access-jpjrd\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " 
pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111344 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-config\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111366 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111387 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn764\" (UniqueName: \"kubernetes.io/projected/27d45de2-e0ab-4c3e-b3da-b20e60e26801-kube-api-access-zn764\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111409 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-trusted-ca-bundle\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111439 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/10ac507b-7307-4e09-ab72-b956d0139396-config\") pod \"machine-approver-56656f9798-dhrww\" (UID: \"10ac507b-7307-4e09-ab72-b956d0139396\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111827 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.128310 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.129905 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.130312 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.130516 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.130750 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.130797 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.130909 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.130967 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 13 
11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.131058 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.131155 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.131314 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.131426 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.131525 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.131588 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.131626 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.131751 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.131790 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.131060 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.131922 4837 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-config-operator"/"config-operator-serving-cert" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.131941 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.132104 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.132218 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.132463 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.132577 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.131752 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.132764 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.132783 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.134400 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.137101 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 
11:51:29.140101 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.147022 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.149565 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.154231 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.154465 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.154687 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.154882 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.156521 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.163760 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.164083 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.164343 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.164432 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.164818 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8vgmn"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.165133 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.165296 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.165503 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.188608 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.189239 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.189379 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xfcxm"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.189511 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.190577 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556710-lcprh"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.190728 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.191397 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.192062 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556710-lcprh" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.193654 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.193895 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.194359 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qqkbm"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.197205 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-84xjl"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.198359 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.199357 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-8dj7w"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.199462 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.200515 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.201710 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.203038 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.204332 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.205728 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.207024 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wcfj4"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.208392 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.209738 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.211048 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2w96t"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212132 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f8bc408a-bca6-42ff-8572-2ba9a3978682-serving-cert\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212159 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks5xn\" (UniqueName: \"kubernetes.io/projected/f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0-kube-api-access-ks5xn\") pod \"openshift-config-operator-7777fb866f-f97pg\" (UID: \"f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212184 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwg6b\" (UniqueName: \"kubernetes.io/projected/255ab2ef-dead-4148-bc85-2514618767b9-kube-api-access-pwg6b\") pod \"openshift-controller-manager-operator-756b6f6bc6-5nslp\" (UID: \"255ab2ef-dead-4148-bc85-2514618767b9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212203 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3eaa54fb-8d70-463c-8388-9f8443a480ed-service-ca-bundle\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212221 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f5681b96-47c5-44f8-9e5d-671678930750-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-bcfcc\" (UID: 
\"f5681b96-47c5-44f8-9e5d-671678930750\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212245 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/003e8201-4e67-4356-b0c1-8cc135451069-config\") pod \"console-operator-58897d9998-8dj7w\" (UID: \"003e8201-4e67-4356-b0c1-8cc135451069\") " pod="openshift-console-operator/console-operator-58897d9998-8dj7w" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212270 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6db10103-96be-4420-b302-a7064e347f61-images\") pod \"machine-api-operator-5694c8668f-vsp2m\" (UID: \"6db10103-96be-4420-b302-a7064e347f61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212293 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-client-ca\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212318 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0-serving-cert\") pod \"openshift-config-operator-7777fb866f-f97pg\" (UID: \"f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212343 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6ad2861b-4f40-4551-8aff-304359734792-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dv9wr\" (UID: \"6ad2861b-4f40-4551-8aff-304359734792\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212369 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fgk4q\" (UID: \"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212393 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmhx9\" (UniqueName: \"kubernetes.io/projected/e6a94afd-1f9a-4281-9d94-2fac3916f2c3-kube-api-access-kmhx9\") pod \"multus-admission-controller-857f4d67dd-wcfj4\" (UID: \"e6a94afd-1f9a-4281-9d94-2fac3916f2c3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wcfj4" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212419 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ffb5553f-d2d5-4584-9bf8-7212a378f358-audit-dir\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212434 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3eaa54fb-8d70-463c-8388-9f8443a480ed-default-certificate\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " 
pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212452 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212468 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ffb5553f-d2d5-4584-9bf8-7212a378f358-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212485 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212505 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4a3cd73-aa6c-4128-8a5f-561719e9b170-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pmg8q\" (UID: \"a4a3cd73-aa6c-4128-8a5f-561719e9b170\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212521 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9025cb05-7c57-488b-a8cb-441552547aae-config\") pod \"kube-apiserver-operator-766d6c64bb-4rjht\" (UID: \"9025cb05-7c57-488b-a8cb-441552547aae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212546 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/003e8201-4e67-4356-b0c1-8cc135451069-trusted-ca\") pod \"console-operator-58897d9998-8dj7w\" (UID: \"003e8201-4e67-4356-b0c1-8cc135451069\") " pod="openshift-console-operator/console-operator-58897d9998-8dj7w" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212565 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e6a94afd-1f9a-4281-9d94-2fac3916f2c3-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wcfj4\" (UID: \"e6a94afd-1f9a-4281-9d94-2fac3916f2c3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wcfj4" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212580 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3e1f747d-78f3-4cbc-b313-eed531936c02-etcd-client\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212598 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-serving-cert\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212616 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkvjg\" (UniqueName: \"kubernetes.io/projected/2c2663fa-7df3-4801-be78-52517eb1f1cf-kube-api-access-gkvjg\") pod \"kube-storage-version-migrator-operator-b67b599dd-v526f\" (UID: \"2c2663fa-7df3-4801-be78-52517eb1f1cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212651 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/10be2947-2e91-4a8e-b54e-69cdab598955-etcd-ca\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212669 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec4b9459-d392-4fc5-9b6f-a87ca50e85b1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-l57bl\" (UID: \"ec4b9459-d392-4fc5-9b6f-a87ca50e85b1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212683 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fgk4q\" (UID: \"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212708 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnwqw\" (UniqueName: 
\"kubernetes.io/projected/10ac507b-7307-4e09-ab72-b956d0139396-kube-api-access-cnwqw\") pod \"machine-approver-56656f9798-dhrww\" (UID: \"10ac507b-7307-4e09-ab72-b956d0139396\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212723 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4a3cd73-aa6c-4128-8a5f-561719e9b170-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pmg8q\" (UID: \"a4a3cd73-aa6c-4128-8a5f-561719e9b170\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212741 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fca26784-7fdf-4923-bd07-35d182c2ad14-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jpbgx\" (UID: \"fca26784-7fdf-4923-bd07-35d182c2ad14\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212760 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a3cabe4-69ee-49f7-a783-e72ac1a56821-serving-cert\") pod \"route-controller-manager-6576b87f9c-qs2qs\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212777 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffb5553f-d2d5-4584-9bf8-7212a378f358-serving-cert\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" 
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212792 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/003e8201-4e67-4356-b0c1-8cc135451069-serving-cert\") pod \"console-operator-58897d9998-8dj7w\" (UID: \"003e8201-4e67-4356-b0c1-8cc135451069\") " pod="openshift-console-operator/console-operator-58897d9998-8dj7w" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212809 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10be2947-2e91-4a8e-b54e-69cdab598955-config\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212824 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3eaa54fb-8d70-463c-8388-9f8443a480ed-stats-auth\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212838 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n895\" (UniqueName: \"kubernetes.io/projected/f5681b96-47c5-44f8-9e5d-671678930750-kube-api-access-4n895\") pod \"package-server-manager-789f6589d5-bcfcc\" (UID: \"f5681b96-47c5-44f8-9e5d-671678930750\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212855 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdq6z\" (UniqueName: \"kubernetes.io/projected/fca26784-7fdf-4923-bd07-35d182c2ad14-kube-api-access-hdq6z\") pod 
\"cluster-samples-operator-665b6dd947-jpbgx\" (UID: \"fca26784-7fdf-4923-bd07-35d182c2ad14\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212871 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212885 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec4b9459-d392-4fc5-9b6f-a87ca50e85b1-config\") pod \"kube-controller-manager-operator-78b949d7b-l57bl\" (UID: \"ec4b9459-d392-4fc5-9b6f-a87ca50e85b1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212899 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ec4b9459-d392-4fc5-9b6f-a87ca50e85b1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-l57bl\" (UID: \"ec4b9459-d392-4fc5-9b6f-a87ca50e85b1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212915 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6db10103-96be-4420-b302-a7064e347f61-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vsp2m\" (UID: \"6db10103-96be-4420-b302-a7064e347f61\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212932 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpjrd\" (UniqueName: \"kubernetes.io/projected/c83842ec-9933-4f84-bb4a-c84ca61a28e1-kube-api-access-jpjrd\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212948 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzgwf\" (UniqueName: \"kubernetes.io/projected/3eaa54fb-8d70-463c-8388-9f8443a480ed-kube-api-access-fzgwf\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212965 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/00848ba6-522a-45c7-81bd-7ab287d77626-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jrm5t\" (UID: \"00848ba6-522a-45c7-81bd-7ab287d77626\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212990 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ffb5553f-d2d5-4584-9bf8-7212a378f358-encryption-config\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213006 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213024 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/10ac507b-7307-4e09-ab72-b956d0139396-auth-proxy-config\") pod \"machine-approver-56656f9798-dhrww\" (UID: \"10ac507b-7307-4e09-ab72-b956d0139396\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213045 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-service-ca-bundle\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213063 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/255ab2ef-dead-4148-bc85-2514618767b9-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5nslp\" (UID: \"255ab2ef-dead-4148-bc85-2514618767b9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213082 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213096 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10be2947-2e91-4a8e-b54e-69cdab598955-serving-cert\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213112 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213127 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9025cb05-7c57-488b-a8cb-441552547aae-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4rjht\" (UID: \"9025cb05-7c57-488b-a8cb-441552547aae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213144 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ad2861b-4f40-4551-8aff-304359734792-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dv9wr\" (UID: \"6ad2861b-4f40-4551-8aff-304359734792\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213162 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6db10103-96be-4420-b302-a7064e347f61-config\") pod \"machine-api-operator-5694c8668f-vsp2m\" (UID: \"6db10103-96be-4420-b302-a7064e347f61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213182 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvs46\" (UniqueName: \"kubernetes.io/projected/f8bc408a-bca6-42ff-8572-2ba9a3978682-kube-api-access-xvs46\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213199 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vffzw\" (UniqueName: \"kubernetes.io/projected/3e1f747d-78f3-4cbc-b313-eed531936c02-kube-api-access-vffzw\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213216 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-image-import-ca\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213233 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj2pk\" (UniqueName: \"kubernetes.io/projected/5a3cabe4-69ee-49f7-a783-e72ac1a56821-kube-api-access-sj2pk\") pod \"route-controller-manager-6576b87f9c-qs2qs\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213251 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/10be2947-2e91-4a8e-b54e-69cdab598955-etcd-client\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213267 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ffb5553f-d2d5-4584-9bf8-7212a378f358-etcd-client\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213284 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-audit\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213301 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3e1f747d-78f3-4cbc-b313-eed531936c02-node-pullsecrets\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213317 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-config\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213336 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80d5bedc-a598-4779-be24-2d512ea7d148-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2jcxn\" (UID: \"80d5bedc-a598-4779-be24-2d512ea7d148\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213353 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5bhr\" (UniqueName: \"kubernetes.io/projected/80d5bedc-a598-4779-be24-2d512ea7d148-kube-api-access-r5bhr\") pod \"ingress-operator-5b745b69d9-2jcxn\" (UID: \"80d5bedc-a598-4779-be24-2d512ea7d148\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213370 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213386 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a3cabe4-69ee-49f7-a783-e72ac1a56821-client-ca\") pod \"route-controller-manager-6576b87f9c-qs2qs\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213402 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213418 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndkht\" (UniqueName: \"kubernetes.io/projected/a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd-kube-api-access-ndkht\") pod \"cluster-image-registry-operator-dc59b4c8b-fgk4q\" (UID: \"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213435 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-service-ca\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213449 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/255ab2ef-dead-4148-bc85-2514618767b9-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5nslp\" (UID: \"255ab2ef-dead-4148-bc85-2514618767b9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213471 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2663fa-7df3-4801-be78-52517eb1f1cf-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v526f\" (UID: \"2c2663fa-7df3-4801-be78-52517eb1f1cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213489 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3eaa54fb-8d70-463c-8388-9f8443a480ed-metrics-certs\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213506 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213523 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e1f747d-78f3-4cbc-b313-eed531936c02-audit-dir\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213539 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-etcd-serving-ca\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213554 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-config\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213568 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c45j\" (UniqueName: \"kubernetes.io/projected/00848ba6-522a-45c7-81bd-7ab287d77626-kube-api-access-7c45j\") pod \"control-plane-machine-set-operator-78cbb6b69f-jrm5t\" (UID: \"00848ba6-522a-45c7-81bd-7ab287d77626\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213586 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-serving-cert\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213601 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/10be2947-2e91-4a8e-b54e-69cdab598955-etcd-service-ca\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213620 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e1f747d-78f3-4cbc-b313-eed531936c02-serving-cert\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213648 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zblb5\" (UniqueName: \"kubernetes.io/projected/fe42bd29-b8a7-4a9f-89e2-ab3b944d7c26-kube-api-access-zblb5\") pod \"migrator-59844c95c7-64xpb\" (UID: \"fe42bd29-b8a7-4a9f-89e2-ab3b944d7c26\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-64xpb"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213667 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-audit-policies\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213683 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bb2n\" (UniqueName: \"kubernetes.io/projected/44f59229-dec6-4d9b-a63b-bd562b4523cf-kube-api-access-5bb2n\") pod \"machine-config-controller-84d6567774-hffr5\" (UID: \"44f59229-dec6-4d9b-a63b-bd562b4523cf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213701 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/80d5bedc-a598-4779-be24-2d512ea7d148-metrics-tls\") pod \"ingress-operator-5b745b69d9-2jcxn\" (UID: \"80d5bedc-a598-4779-be24-2d512ea7d148\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213727 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-oauth-config\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213754 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4a3cd73-aa6c-4128-8a5f-561719e9b170-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pmg8q\" (UID: \"a4a3cd73-aa6c-4128-8a5f-561719e9b170\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213778 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fgk4q\" (UID: \"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213803 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-config\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213827 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213850 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn764\" (UniqueName: \"kubernetes.io/projected/27d45de2-e0ab-4c3e-b3da-b20e60e26801-kube-api-access-zn764\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213871 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-trusted-ca-bundle\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213903 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10ac507b-7307-4e09-ab72-b956d0139396-config\") pod \"machine-approver-56656f9798-dhrww\" (UID: \"10ac507b-7307-4e09-ab72-b956d0139396\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213927 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/44f59229-dec6-4d9b-a63b-bd562b4523cf-proxy-tls\") pod \"machine-config-controller-84d6567774-hffr5\" (UID: \"44f59229-dec6-4d9b-a63b-bd562b4523cf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213952 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80d5bedc-a598-4779-be24-2d512ea7d148-trusted-ca\") pod \"ingress-operator-5b745b69d9-2jcxn\" (UID: \"80d5bedc-a598-4779-be24-2d512ea7d148\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213969 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27d45de2-e0ab-4c3e-b3da-b20e60e26801-audit-dir\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213989 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a3cabe4-69ee-49f7-a783-e72ac1a56821-config\") pod \"route-controller-manager-6576b87f9c-qs2qs\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214009 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214027 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-config\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214043 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74s9f\" (UniqueName: \"kubernetes.io/projected/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-kube-api-access-74s9f\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214059 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/44f59229-dec6-4d9b-a63b-bd562b4523cf-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hffr5\" (UID: \"44f59229-dec6-4d9b-a63b-bd562b4523cf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214076 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0-available-featuregates\") pod \"openshift-config-operator-7777fb866f-f97pg\" (UID: \"f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214091 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2663fa-7df3-4801-be78-52517eb1f1cf-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v526f\" (UID: \"2c2663fa-7df3-4801-be78-52517eb1f1cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214109 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ffb5553f-d2d5-4584-9bf8-7212a378f358-audit-policies\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214124 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsdcp\" (UniqueName: \"kubernetes.io/projected/ffb5553f-d2d5-4584-9bf8-7212a378f358-kube-api-access-zsdcp\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214140 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9blhw\" (UniqueName: \"kubernetes.io/projected/003e8201-4e67-4356-b0c1-8cc135451069-kube-api-access-9blhw\") pod \"console-operator-58897d9998-8dj7w\" (UID: \"003e8201-4e67-4356-b0c1-8cc135451069\") " pod="openshift-console-operator/console-operator-58897d9998-8dj7w"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214158 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6dmk\" (UniqueName: \"kubernetes.io/projected/6db10103-96be-4420-b302-a7064e347f61-kube-api-access-q6dmk\") pod \"machine-api-operator-5694c8668f-vsp2m\" (UID: \"6db10103-96be-4420-b302-a7064e347f61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214177 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214192 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214207 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-oauth-serving-cert\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214223 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/10ac507b-7307-4e09-ab72-b956d0139396-machine-approver-tls\") pod \"machine-approver-56656f9798-dhrww\" (UID: \"10ac507b-7307-4e09-ab72-b956d0139396\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214268 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9025cb05-7c57-488b-a8cb-441552547aae-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4rjht\" (UID: \"9025cb05-7c57-488b-a8cb-441552547aae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214288 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffb5553f-d2d5-4584-9bf8-7212a378f358-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214305 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mclx9\" (UniqueName: \"kubernetes.io/projected/6ad2861b-4f40-4551-8aff-304359734792-kube-api-access-mclx9\") pod \"openshift-apiserver-operator-796bbdcf4f-dv9wr\" (UID: \"6ad2861b-4f40-4551-8aff-304359734792\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214323 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm2dw\" (UniqueName: \"kubernetes.io/projected/10be2947-2e91-4a8e-b54e-69cdab598955-kube-api-access-mm2dw\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214339 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214363 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3e1f747d-78f3-4cbc-b313-eed531936c02-encryption-config\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.215070 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8q6j6"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.215110 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8ktsx"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.215123 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.216163 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.218092 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/003e8201-4e67-4356-b0c1-8cc135451069-config\") pod \"console-operator-58897d9998-8dj7w\" (UID: \"003e8201-4e67-4356-b0c1-8cc135451069\") " pod="openshift-console-operator/console-operator-58897d9998-8dj7w"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.218717 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6db10103-96be-4420-b302-a7064e347f61-images\") pod \"machine-api-operator-5694c8668f-vsp2m\" (UID: \"6db10103-96be-4420-b302-a7064e347f61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.218998 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ffb5553f-d2d5-4584-9bf8-7212a378f358-audit-dir\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.220050 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-q2qpt"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.220088 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vsp2m"]
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.221282 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a3cabe4-69ee-49f7-a783-e72ac1a56821-serving-cert\") pod \"route-controller-manager-6576b87f9c-qs2qs\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.221746 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad2861b-4f40-4551-8aff-304359734792-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dv9wr\" (UID: \"6ad2861b-4f40-4551-8aff-304359734792\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.222470 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-config\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.223515 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6db10103-96be-4420-b302-a7064e347f61-config\") pod \"machine-api-operator-5694c8668f-vsp2m\" (UID: \"6db10103-96be-4420-b302-a7064e347f61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.223795 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3e1f747d-78f3-4cbc-b313-eed531936c02-node-pullsecrets\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.224446 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10ac507b-7307-4e09-ab72-b956d0139396-config\") pod \"machine-approver-56656f9798-dhrww\" (UID: \"10ac507b-7307-4e09-ab72-b956d0139396\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.224486 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/10ac507b-7307-4e09-ab72-b956d0139396-auth-proxy-config\") pod \"machine-approver-56656f9798-dhrww\" (UID: \"10ac507b-7307-4e09-ab72-b956d0139396\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.224512 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ffb5553f-d2d5-4584-9bf8-7212a378f358-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.224702 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffb5553f-d2d5-4584-9bf8-7212a378f358-serving-cert\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.224778 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0-available-featuregates\") pod \"openshift-config-operator-7777fb866f-f97pg\" (UID: \"f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.224935 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-service-ca-bundle\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.225205 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-audit\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.225207 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.225307 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6db10103-96be-4420-b302-a7064e347f61-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vsp2m\" (UID: \"6db10103-96be-4420-b302-a7064e347f61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.225329 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27d45de2-e0ab-4c3e-b3da-b20e60e26801-audit-dir\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.225723 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3e1f747d-78f3-4cbc-b313-eed531936c02-encryption-config\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.225865 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ad2861b-4f40-4551-8aff-304359734792-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dv9wr\" (UID: \"6ad2861b-4f40-4551-8aff-304359734792\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.225914 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e1f747d-78f3-4cbc-b313-eed531936c02-audit-dir\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.226054 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ffb5553f-d2d5-4584-9bf8-7212a378f358-audit-policies\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.226136 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a3cabe4-69ee-49f7-a783-e72ac1a56821-client-ca\") pod \"route-controller-manager-6576b87f9c-qs2qs\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.226458 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0-serving-cert\") pod \"openshift-config-operator-7777fb866f-f97pg\" (UID: \"f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.226575 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-etcd-serving-ca\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.226728 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.226878 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e1f747d-78f3-4cbc-b313-eed531936c02-serving-cert\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.227461 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8bc408a-bca6-42ff-8572-2ba9a3978682-serving-cert\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.227467 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-config\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.227556 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffb5553f-d2d5-4584-9bf8-7212a378f358-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.227981 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-client-ca\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.228004 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/003e8201-4e67-4356-b0c1-8cc135451069-serving-cert\") pod \"console-operator-58897d9998-8dj7w\" (UID: \"003e8201-4e67-4356-b0c1-8cc135451069\") " pod="openshift-console-operator/console-operator-58897d9998-8dj7w"
Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.228306 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/10ac507b-7307-4e09-ab72-b956d0139396-machine-approver-tls\") pod
\"machine-approver-56656f9798-dhrww\" (UID: \"10ac507b-7307-4e09-ab72-b956d0139396\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.228345 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-84ccm"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.228370 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.228701 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-config\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.228744 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-config\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.229606 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-trusted-ca-bundle\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.230069 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-image-import-ca\") pod 
\"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.230241 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-service-ca\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.230393 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-audit-policies\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.230449 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a3cabe4-69ee-49f7-a783-e72ac1a56821-config\") pod \"route-controller-manager-6576b87f9c-qs2qs\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.230596 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/003e8201-4e67-4356-b0c1-8cc135451069-trusted-ca\") pod \"console-operator-58897d9998-8dj7w\" (UID: \"003e8201-4e67-4356-b0c1-8cc135451069\") " pod="openshift-console-operator/console-operator-58897d9998-8dj7w" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.230604 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.230871 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.231560 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.231778 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.231852 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" Mar 13 11:51:29 crc 
kubenswrapper[4837]: I0313 11:51:29.232061 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.232060 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.232392 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.232553 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3e1f747d-78f3-4cbc-b313-eed531936c02-etcd-client\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.233695 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.234496 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.234553 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-z9thp"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.234998 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-serving-cert\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.235308 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.235503 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-z9thp" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.236134 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ffb5553f-d2d5-4584-9bf8-7212a378f358-encryption-config\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.236709 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.236993 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-serving-cert\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.237848 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fca26784-7fdf-4923-bd07-35d182c2ad14-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jpbgx\" (UID: \"fca26784-7fdf-4923-bd07-35d182c2ad14\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.238102 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.238115 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.238329 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-oauth-config\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.238737 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9hkj4"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.239383 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9hkj4" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.240874 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-64xpb"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.243422 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.245522 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.247438 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.248023 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ffb5553f-d2d5-4584-9bf8-7212a378f358-etcd-client\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.248713 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-f97pg"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.249746 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.251095 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-l4rxn"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.252307 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-8vgmn"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.253664 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.255276 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8zzqp"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.256482 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.257928 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.258225 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.259710 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-z9thp"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.261052 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556710-lcprh"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.262709 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-84xjl"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.263866 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9hkj4"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.265279 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.266488 4837 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xfcxm"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.268147 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.269266 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-9g2bm"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.269836 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9g2bm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.271030 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.278813 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.298408 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315193 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fgk4q\" (UID: \"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315233 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmhx9\" (UniqueName: 
\"kubernetes.io/projected/e6a94afd-1f9a-4281-9d94-2fac3916f2c3-kube-api-access-kmhx9\") pod \"multus-admission-controller-857f4d67dd-wcfj4\" (UID: \"e6a94afd-1f9a-4281-9d94-2fac3916f2c3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wcfj4" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315255 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3eaa54fb-8d70-463c-8388-9f8443a480ed-default-certificate\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315273 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9025cb05-7c57-488b-a8cb-441552547aae-config\") pod \"kube-apiserver-operator-766d6c64bb-4rjht\" (UID: \"9025cb05-7c57-488b-a8cb-441552547aae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315293 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4a3cd73-aa6c-4128-8a5f-561719e9b170-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pmg8q\" (UID: \"a4a3cd73-aa6c-4128-8a5f-561719e9b170\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315314 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e6a94afd-1f9a-4281-9d94-2fac3916f2c3-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wcfj4\" (UID: \"e6a94afd-1f9a-4281-9d94-2fac3916f2c3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wcfj4" Mar 13 11:51:29 crc kubenswrapper[4837]: 
I0313 11:51:29.315331 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkvjg\" (UniqueName: \"kubernetes.io/projected/2c2663fa-7df3-4801-be78-52517eb1f1cf-kube-api-access-gkvjg\") pod \"kube-storage-version-migrator-operator-b67b599dd-v526f\" (UID: \"2c2663fa-7df3-4801-be78-52517eb1f1cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315348 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/10be2947-2e91-4a8e-b54e-69cdab598955-etcd-ca\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315364 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec4b9459-d392-4fc5-9b6f-a87ca50e85b1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-l57bl\" (UID: \"ec4b9459-d392-4fc5-9b6f-a87ca50e85b1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315379 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fgk4q\" (UID: \"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315406 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4a3cd73-aa6c-4128-8a5f-561719e9b170-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-pmg8q\" (UID: \"a4a3cd73-aa6c-4128-8a5f-561719e9b170\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315424 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10be2947-2e91-4a8e-b54e-69cdab598955-config\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315445 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3eaa54fb-8d70-463c-8388-9f8443a480ed-stats-auth\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315463 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n895\" (UniqueName: \"kubernetes.io/projected/f5681b96-47c5-44f8-9e5d-671678930750-kube-api-access-4n895\") pod \"package-server-manager-789f6589d5-bcfcc\" (UID: \"f5681b96-47c5-44f8-9e5d-671678930750\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315479 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec4b9459-d392-4fc5-9b6f-a87ca50e85b1-config\") pod \"kube-controller-manager-operator-78b949d7b-l57bl\" (UID: \"ec4b9459-d392-4fc5-9b6f-a87ca50e85b1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315495 4837 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ec4b9459-d392-4fc5-9b6f-a87ca50e85b1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-l57bl\" (UID: \"ec4b9459-d392-4fc5-9b6f-a87ca50e85b1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315516 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzgwf\" (UniqueName: \"kubernetes.io/projected/3eaa54fb-8d70-463c-8388-9f8443a480ed-kube-api-access-fzgwf\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315531 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/00848ba6-522a-45c7-81bd-7ab287d77626-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jrm5t\" (UID: \"00848ba6-522a-45c7-81bd-7ab287d77626\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315549 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/255ab2ef-dead-4148-bc85-2514618767b9-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5nslp\" (UID: \"255ab2ef-dead-4148-bc85-2514618767b9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315563 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10be2947-2e91-4a8e-b54e-69cdab598955-serving-cert\") pod \"etcd-operator-b45778765-l4rxn\" 
(UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315578 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9025cb05-7c57-488b-a8cb-441552547aae-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4rjht\" (UID: \"9025cb05-7c57-488b-a8cb-441552547aae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315613 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/10be2947-2e91-4a8e-b54e-69cdab598955-etcd-client\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315629 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80d5bedc-a598-4779-be24-2d512ea7d148-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2jcxn\" (UID: \"80d5bedc-a598-4779-be24-2d512ea7d148\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315668 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5bhr\" (UniqueName: \"kubernetes.io/projected/80d5bedc-a598-4779-be24-2d512ea7d148-kube-api-access-r5bhr\") pod \"ingress-operator-5b745b69d9-2jcxn\" (UID: \"80d5bedc-a598-4779-be24-2d512ea7d148\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315688 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndkht\" (UniqueName: 
\"kubernetes.io/projected/a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd-kube-api-access-ndkht\") pod \"cluster-image-registry-operator-dc59b4c8b-fgk4q\" (UID: \"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315709 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/255ab2ef-dead-4148-bc85-2514618767b9-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5nslp\" (UID: \"255ab2ef-dead-4148-bc85-2514618767b9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315725 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2663fa-7df3-4801-be78-52517eb1f1cf-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v526f\" (UID: \"2c2663fa-7df3-4801-be78-52517eb1f1cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315742 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3eaa54fb-8d70-463c-8388-9f8443a480ed-metrics-certs\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315758 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c45j\" (UniqueName: \"kubernetes.io/projected/00848ba6-522a-45c7-81bd-7ab287d77626-kube-api-access-7c45j\") pod \"control-plane-machine-set-operator-78cbb6b69f-jrm5t\" (UID: \"00848ba6-522a-45c7-81bd-7ab287d77626\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315777 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/10be2947-2e91-4a8e-b54e-69cdab598955-etcd-service-ca\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315792 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zblb5\" (UniqueName: \"kubernetes.io/projected/fe42bd29-b8a7-4a9f-89e2-ab3b944d7c26-kube-api-access-zblb5\") pod \"migrator-59844c95c7-64xpb\" (UID: \"fe42bd29-b8a7-4a9f-89e2-ab3b944d7c26\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-64xpb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315808 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bb2n\" (UniqueName: \"kubernetes.io/projected/44f59229-dec6-4d9b-a63b-bd562b4523cf-kube-api-access-5bb2n\") pod \"machine-config-controller-84d6567774-hffr5\" (UID: \"44f59229-dec6-4d9b-a63b-bd562b4523cf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315824 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/80d5bedc-a598-4779-be24-2d512ea7d148-metrics-tls\") pod \"ingress-operator-5b745b69d9-2jcxn\" (UID: \"80d5bedc-a598-4779-be24-2d512ea7d148\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315843 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/a4a3cd73-aa6c-4128-8a5f-561719e9b170-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pmg8q\" (UID: \"a4a3cd73-aa6c-4128-8a5f-561719e9b170\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315856 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fgk4q\" (UID: \"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315882 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/44f59229-dec6-4d9b-a63b-bd562b4523cf-proxy-tls\") pod \"machine-config-controller-84d6567774-hffr5\" (UID: \"44f59229-dec6-4d9b-a63b-bd562b4523cf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315897 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80d5bedc-a598-4779-be24-2d512ea7d148-trusted-ca\") pod \"ingress-operator-5b745b69d9-2jcxn\" (UID: \"80d5bedc-a598-4779-be24-2d512ea7d148\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315925 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/44f59229-dec6-4d9b-a63b-bd562b4523cf-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hffr5\" (UID: \"44f59229-dec6-4d9b-a63b-bd562b4523cf\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315946 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2663fa-7df3-4801-be78-52517eb1f1cf-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v526f\" (UID: \"2c2663fa-7df3-4801-be78-52517eb1f1cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.316022 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9025cb05-7c57-488b-a8cb-441552547aae-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4rjht\" (UID: \"9025cb05-7c57-488b-a8cb-441552547aae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.316048 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm2dw\" (UniqueName: \"kubernetes.io/projected/10be2947-2e91-4a8e-b54e-69cdab598955-kube-api-access-mm2dw\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.316080 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3eaa54fb-8d70-463c-8388-9f8443a480ed-service-ca-bundle\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.316106 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwg6b\" (UniqueName: 
\"kubernetes.io/projected/255ab2ef-dead-4148-bc85-2514618767b9-kube-api-access-pwg6b\") pod \"openshift-controller-manager-operator-756b6f6bc6-5nslp\" (UID: \"255ab2ef-dead-4148-bc85-2514618767b9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.316129 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f5681b96-47c5-44f8-9e5d-671678930750-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-bcfcc\" (UID: \"f5681b96-47c5-44f8-9e5d-671678930750\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.316937 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec4b9459-d392-4fc5-9b6f-a87ca50e85b1-config\") pod \"kube-controller-manager-operator-78b949d7b-l57bl\" (UID: \"ec4b9459-d392-4fc5-9b6f-a87ca50e85b1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.318799 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.321074 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/255ab2ef-dead-4148-bc85-2514618767b9-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5nslp\" (UID: \"255ab2ef-dead-4148-bc85-2514618767b9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.323302 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/44f59229-dec6-4d9b-a63b-bd562b4523cf-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hffr5\" (UID: \"44f59229-dec6-4d9b-a63b-bd562b4523cf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.324919 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9025cb05-7c57-488b-a8cb-441552547aae-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4rjht\" (UID: \"9025cb05-7c57-488b-a8cb-441552547aae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.326167 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fgk4q\" (UID: \"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.328565 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec4b9459-d392-4fc5-9b6f-a87ca50e85b1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-l57bl\" (UID: \"ec4b9459-d392-4fc5-9b6f-a87ca50e85b1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.328595 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/255ab2ef-dead-4148-bc85-2514618767b9-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5nslp\" (UID: \"255ab2ef-dead-4148-bc85-2514618767b9\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.329014 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fgk4q\" (UID: \"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.338581 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.357881 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.378211 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.388883 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4a3cd73-aa6c-4128-8a5f-561719e9b170-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pmg8q\" (UID: \"a4a3cd73-aa6c-4128-8a5f-561719e9b170\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.398316 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.407544 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a4a3cd73-aa6c-4128-8a5f-561719e9b170-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pmg8q\" (UID: \"a4a3cd73-aa6c-4128-8a5f-561719e9b170\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.418190 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.438337 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.439747 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2663fa-7df3-4801-be78-52517eb1f1cf-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v526f\" (UID: \"2c2663fa-7df3-4801-be78-52517eb1f1cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.458934 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.478743 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.482429 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2663fa-7df3-4801-be78-52517eb1f1cf-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v526f\" (UID: \"2c2663fa-7df3-4801-be78-52517eb1f1cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f" Mar 13 11:51:29 crc 
kubenswrapper[4837]: I0313 11:51:29.499036 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.501014 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/10be2947-2e91-4a8e-b54e-69cdab598955-etcd-service-ca\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.518523 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.538961 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.547574 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10be2947-2e91-4a8e-b54e-69cdab598955-config\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.558823 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.568115 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/10be2947-2e91-4a8e-b54e-69cdab598955-etcd-ca\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.578153 4837 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.583441 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10be2947-2e91-4a8e-b54e-69cdab598955-serving-cert\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.598430 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.618231 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.638495 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.644085 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/10be2947-2e91-4a8e-b54e-69cdab598955-etcd-client\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.666600 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.672013 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80d5bedc-a598-4779-be24-2d512ea7d148-trusted-ca\") pod \"ingress-operator-5b745b69d9-2jcxn\" (UID: \"80d5bedc-a598-4779-be24-2d512ea7d148\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.679177 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.699706 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.714142 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/80d5bedc-a598-4779-be24-2d512ea7d148-metrics-tls\") pod \"ingress-operator-5b745b69d9-2jcxn\" (UID: \"80d5bedc-a598-4779-be24-2d512ea7d148\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.718422 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.738765 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.759150 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.763742 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-oauth-serving-cert\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.778750 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 
11:51:29.798771 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.819693 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.839347 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.858447 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.868791 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9025cb05-7c57-488b-a8cb-441552547aae-config\") pod \"kube-apiserver-operator-766d6c64bb-4rjht\" (UID: \"9025cb05-7c57-488b-a8cb-441552547aae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.878331 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.898879 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.903257 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3eaa54fb-8d70-463c-8388-9f8443a480ed-metrics-certs\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.919068 4837 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.938155 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.950400 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3eaa54fb-8d70-463c-8388-9f8443a480ed-default-certificate\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.959588 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.969922 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3eaa54fb-8d70-463c-8388-9f8443a480ed-stats-auth\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.978603 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.981228 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3eaa54fb-8d70-463c-8388-9f8443a480ed-service-ca-bundle\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.999206 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.018664 4837 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.025727 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e6a94afd-1f9a-4281-9d94-2fac3916f2c3-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wcfj4\" (UID: \"e6a94afd-1f9a-4281-9d94-2fac3916f2c3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wcfj4" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.039056 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.056670 4837 request.go:700] Waited for 1.013104437s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmultus-ac-dockercfg-9lkdf&limit=500&resourceVersion=0 Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.057955 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.079036 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.099473 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.118539 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.139161 4837 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.159297 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.170864 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f5681b96-47c5-44f8-9e5d-671678930750-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-bcfcc\" (UID: \"f5681b96-47c5-44f8-9e5d-671678930750\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.179768 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.198963 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.219654 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.237671 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/00848ba6-522a-45c7-81bd-7ab287d77626-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jrm5t\" (UID: \"00848ba6-522a-45c7-81bd-7ab287d77626\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.238797 4837 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.259057 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.267103 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/44f59229-dec6-4d9b-a63b-bd562b4523cf-proxy-tls\") pod \"machine-config-controller-84d6567774-hffr5\" (UID: \"44f59229-dec6-4d9b-a63b-bd562b4523cf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.319111 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.339319 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.358911 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.379031 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.400305 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.419450 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.438404 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.468792 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.480009 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.500015 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.519457 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.539616 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.559417 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.579585 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.598467 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.618728 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.639082 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.658657 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.678127 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.698856 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.718490 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.739343 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.758849 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.778622 4837 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.798897 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.819279 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.861628 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks5xn\" (UniqueName: \"kubernetes.io/projected/f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0-kube-api-access-ks5xn\") pod \"openshift-config-operator-7777fb866f-f97pg\" (UID: \"f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.876521 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdq6z\" (UniqueName: \"kubernetes.io/projected/fca26784-7fdf-4923-bd07-35d182c2ad14-kube-api-access-hdq6z\") pod \"cluster-samples-operator-665b6dd947-jpbgx\" (UID: \"fca26784-7fdf-4923-bd07-35d182c2ad14\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.899747 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpjrd\" (UniqueName: \"kubernetes.io/projected/c83842ec-9933-4f84-bb4a-c84ca61a28e1-kube-api-access-jpjrd\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.912591 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74s9f\" (UniqueName: \"kubernetes.io/projected/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-kube-api-access-74s9f\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.933069 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvs46\" (UniqueName: \"kubernetes.io/projected/f8bc408a-bca6-42ff-8572-2ba9a3978682-kube-api-access-xvs46\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.951683 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vffzw\" (UniqueName: \"kubernetes.io/projected/3e1f747d-78f3-4cbc-b313-eed531936c02-kube-api-access-vffzw\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.973518 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj2pk\" (UniqueName: \"kubernetes.io/projected/5a3cabe4-69ee-49f7-a783-e72ac1a56821-kube-api-access-sj2pk\") pod \"route-controller-manager-6576b87f9c-qs2qs\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs"
Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.991759 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6dmk\" (UniqueName: \"kubernetes.io/projected/6db10103-96be-4420-b302-a7064e347f61-kube-api-access-q6dmk\") pod \"machine-api-operator-5694c8668f-vsp2m\" (UID: \"6db10103-96be-4420-b302-a7064e347f61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.003151 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.010283 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.017408 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9blhw\" (UniqueName: \"kubernetes.io/projected/003e8201-4e67-4356-b0c1-8cc135451069-kube-api-access-9blhw\") pod \"console-operator-58897d9998-8dj7w\" (UID: \"003e8201-4e67-4356-b0c1-8cc135451069\") " pod="openshift-console-operator/console-operator-58897d9998-8dj7w"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.033836 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn764\" (UniqueName: \"kubernetes.io/projected/27d45de2-e0ab-4c3e-b3da-b20e60e26801-kube-api-access-zn764\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.054298 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsdcp\" (UniqueName: \"kubernetes.io/projected/ffb5553f-d2d5-4584-9bf8-7212a378f358-kube-api-access-zsdcp\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.055793 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-qqkbm"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.057378 4837 request.go:700] Waited for 1.830411174s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/serviceaccounts/openshift-apiserver-operator/token
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.072946 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-q2qpt"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.078264 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.079686 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mclx9\" (UniqueName: \"kubernetes.io/projected/6ad2861b-4f40-4551-8aff-304359734792-kube-api-access-mclx9\") pod \"openshift-apiserver-operator-796bbdcf4f-dv9wr\" (UID: \"6ad2861b-4f40-4551-8aff-304359734792\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.096733 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnwqw\" (UniqueName: \"kubernetes.io/projected/10ac507b-7307-4e09-ab72-b956d0139396-kube-api-access-cnwqw\") pod \"machine-approver-56656f9798-dhrww\" (UID: \"10ac507b-7307-4e09-ab72-b956d0139396\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.099271 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.121373 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.123996 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.128012 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.138824 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.159666 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.170820 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.179862 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.200429 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.207294 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.226008 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.231913 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.239899 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.241118 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.258613 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.261691 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.281226 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.294518 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-8dj7w"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.315855 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n895\" (UniqueName: \"kubernetes.io/projected/f5681b96-47c5-44f8-9e5d-671678930750-kube-api-access-4n895\") pod \"package-server-manager-789f6589d5-bcfcc\" (UID: \"f5681b96-47c5-44f8-9e5d-671678930750\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.335346 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmhx9\" (UniqueName: \"kubernetes.io/projected/e6a94afd-1f9a-4281-9d94-2fac3916f2c3-kube-api-access-kmhx9\") pod \"multus-admission-controller-857f4d67dd-wcfj4\" (UID: \"e6a94afd-1f9a-4281-9d94-2fac3916f2c3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wcfj4"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.358979 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ec4b9459-d392-4fc5-9b6f-a87ca50e85b1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-l57bl\" (UID: \"ec4b9459-d392-4fc5-9b6f-a87ca50e85b1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.378680 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzgwf\" (UniqueName: \"kubernetes.io/projected/3eaa54fb-8d70-463c-8388-9f8443a480ed-kube-api-access-fzgwf\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.405394 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.413662 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-f97pg"]
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.416844 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkvjg\" (UniqueName: \"kubernetes.io/projected/2c2663fa-7df3-4801-be78-52517eb1f1cf-kube-api-access-gkvjg\") pod \"kube-storage-version-migrator-operator-b67b599dd-v526f\" (UID: \"2c2663fa-7df3-4801-be78-52517eb1f1cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.433342 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zblb5\" (UniqueName: \"kubernetes.io/projected/fe42bd29-b8a7-4a9f-89e2-ab3b944d7c26-kube-api-access-zblb5\") pod \"migrator-59844c95c7-64xpb\" (UID: \"fe42bd29-b8a7-4a9f-89e2-ab3b944d7c26\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-64xpb"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.433363 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bb2n\" (UniqueName: \"kubernetes.io/projected/44f59229-dec6-4d9b-a63b-bd562b4523cf-kube-api-access-5bb2n\") pod \"machine-config-controller-84d6567774-hffr5\" (UID: \"44f59229-dec6-4d9b-a63b-bd562b4523cf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.446385 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-84ccm"]
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.450467 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-64xpb"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.456017 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndkht\" (UniqueName: \"kubernetes.io/projected/a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd-kube-api-access-ndkht\") pod \"cluster-image-registry-operator-dc59b4c8b-fgk4q\" (UID: \"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.458897 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.462798 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vsp2m"]
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.476542 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4a3cd73-aa6c-4128-8a5f-561719e9b170-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pmg8q\" (UID: \"a4a3cd73-aa6c-4128-8a5f-561719e9b170\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.489746 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-9tkxg"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.493835 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fgk4q\" (UID: \"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.499412 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-wcfj4"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.518915 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.527004 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.527988 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9025cb05-7c57-488b-a8cb-441552547aae-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4rjht\" (UID: \"9025cb05-7c57-488b-a8cb-441552547aae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht"
Mar 13 11:51:31 crc kubenswrapper[4837]: W0313 11:51:31.532601 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3eaa54fb_8d70_463c_8388_9f8443a480ed.slice/crio-b1b89859732d6e3f130db8770074382bf2ec9c2b4d0b2c135f5f19ccd80108b4 WatchSource:0}: Error finding container b1b89859732d6e3f130db8770074382bf2ec9c2b4d0b2c135f5f19ccd80108b4: Status 404 returned error can't find the container with id b1b89859732d6e3f130db8770074382bf2ec9c2b4d0b2c135f5f19ccd80108b4
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.544979 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwg6b\" (UniqueName: \"kubernetes.io/projected/255ab2ef-dead-4148-bc85-2514618767b9-kube-api-access-pwg6b\") pod \"openshift-controller-manager-operator-756b6f6bc6-5nslp\" (UID: \"255ab2ef-dead-4148-bc85-2514618767b9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.556556 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80d5bedc-a598-4779-be24-2d512ea7d148-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2jcxn\" (UID: \"80d5bedc-a598-4779-be24-2d512ea7d148\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.584080 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm2dw\" (UniqueName: \"kubernetes.io/projected/10be2947-2e91-4a8e-b54e-69cdab598955-kube-api-access-mm2dw\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.586843 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx"]
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.588911 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qqkbm"]
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.591671 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-q2qpt"]
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.596571 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5bhr\" (UniqueName: \"kubernetes.io/projected/80d5bedc-a598-4779-be24-2d512ea7d148-kube-api-access-r5bhr\") pod \"ingress-operator-5b745b69d9-2jcxn\" (UID: \"80d5bedc-a598-4779-be24-2d512ea7d148\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.618086 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c45j\" (UniqueName: \"kubernetes.io/projected/00848ba6-522a-45c7-81bd-7ab287d77626-kube-api-access-7c45j\") pod \"control-plane-machine-set-operator-78cbb6b69f-jrm5t\" (UID: \"00848ba6-522a-45c7-81bd-7ab287d77626\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658369 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658407 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-bound-sa-token\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658430 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658451 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpw7f\" (UniqueName: \"kubernetes.io/projected/416fd214-ef6d-45b4-bf11-a35c92909523-kube-api-access-vpw7f\") pod \"dns-operator-744455d44c-8zzqp\" (UID: \"416fd214-ef6d-45b4-bf11-a35c92909523\") " pod="openshift-dns-operator/dns-operator-744455d44c-8zzqp"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658470 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658493 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7xs7\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-kube-api-access-g7xs7\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658510 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/416fd214-ef6d-45b4-bf11-a35c92909523-metrics-tls\") pod \"dns-operator-744455d44c-8zzqp\" (UID: \"416fd214-ef6d-45b4-bf11-a35c92909523\") " pod="openshift-dns-operator/dns-operator-744455d44c-8zzqp"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658565 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57-auth-proxy-config\") pod \"machine-config-operator-74547568cd-69xj9\" (UID: \"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658588 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-registry-tls\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658602 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-trusted-ca\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658653 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57-images\") pod \"machine-config-operator-74547568cd-69xj9\" (UID: \"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658669 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b45pg\" (UniqueName: \"kubernetes.io/projected/edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57-kube-api-access-b45pg\") pod \"machine-config-operator-74547568cd-69xj9\" (UID: \"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658691 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57-proxy-tls\") pod \"machine-config-operator-74547568cd-69xj9\" (UID: \"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658717 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-registry-certificates\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658743 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzdbz\" (UniqueName: \"kubernetes.io/projected/85ac6950-8b98-4d0c-8a2b-7eeeac8d1435-kube-api-access-jzdbz\") pod \"downloads-7954f5f757-8ktsx\" (UID: \"85ac6950-8b98-4d0c-8a2b-7eeeac8d1435\") " pod="openshift-console/downloads-7954f5f757-8ktsx"
Mar 13 11:51:31 crc kubenswrapper[4837]: E0313 11:51:31.659295 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:32.159284811 +0000 UTC m=+207.797551574 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.689150 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.692860 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb"]
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.718268 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.744512 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.755537 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8q6j6"]
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.760243 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.760418 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6085cb91-fec3-45bd-bfdc-a10e6043049f-metrics-tls\") pod \"dns-default-z9thp\" (UID: \"6085cb91-fec3-45bd-bfdc-a10e6043049f\") " pod="openshift-dns/dns-default-z9thp"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.760442 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57-auth-proxy-config\") pod \"machine-config-operator-74547568cd-69xj9\" (UID: \"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.760462 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d0005e35-a11c-4773-a0d1-94fa4aff8a14-apiservice-cert\") pod \"packageserver-d55dfcdfc-xhx6c\" (UID: \"d0005e35-a11c-4773-a0d1-94fa4aff8a14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.760519 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d0005e35-a11c-4773-a0d1-94fa4aff8a14-webhook-cert\") pod \"packageserver-d55dfcdfc-xhx6c\" (UID: \"d0005e35-a11c-4773-a0d1-94fa4aff8a14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.760533 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/90308f63-bacc-491b-9ce2-ffbb2eaaea1f-cert\") pod \"ingress-canary-9hkj4\" (UID: \"90308f63-bacc-491b-9ce2-ffbb2eaaea1f\") " pod="openshift-ingress-canary/ingress-canary-9hkj4"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.760549 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e366a2cd-5dfa-45c9-b187-92772da0b827-srv-cert\") pod \"olm-operator-6b444d44fb-659h7\" (UID: \"e366a2cd-5dfa-45c9-b187-92772da0b827\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.760589 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-registry-tls\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.760607 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjqzb\" (UniqueName: \"kubernetes.io/projected/2960b8ba-5517-4915-b524-1f3f6d0f043c-kube-api-access-fjqzb\") pod \"service-ca-9c57cc56f-xfcxm\" (UID: \"2960b8ba-5517-4915-b524-1f3f6d0f043c\") " pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.760657 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-trusted-ca\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.760674 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d0005e35-a11c-4773-a0d1-94fa4aff8a14-tmpfs\") pod \"packageserver-d55dfcdfc-xhx6c\" (UID: \"d0005e35-a11c-4773-a0d1-94fa4aff8a14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.760689 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlwdw\" (UniqueName: \"kubernetes.io/projected/0484d991-f239-47a2-80ff-0237945c27ac-kube-api-access-dlwdw\") pod \"auto-csr-approver-29556710-lcprh\" (UID: \"0484d991-f239-47a2-80ff-0237945c27ac\") " pod="openshift-infra/auto-csr-approver-29556710-lcprh"
Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.760740 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-registration-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") "
pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.761737 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57-auth-proxy-config\") pod \"machine-config-operator-74547568cd-69xj9\" (UID: \"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.761748 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-mountpoint-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: E0313 11:51:31.761892 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:32.261874203 +0000 UTC m=+207.900140976 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.762059 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-csi-data-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.762081 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-socket-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.762334 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57-images\") pod \"machine-config-operator-74547568cd-69xj9\" (UID: \"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.762347 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-trusted-ca\") pod \"image-registry-697d97f7c8-2w96t\" (UID: 
\"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.762374 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b45pg\" (UniqueName: \"kubernetes.io/projected/edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57-kube-api-access-b45pg\") pod \"machine-config-operator-74547568cd-69xj9\" (UID: \"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.762512 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl4f8\" (UniqueName: \"kubernetes.io/projected/831db5b2-5229-4b52-8783-f99c640ba856-kube-api-access-pl4f8\") pod \"collect-profiles-29556705-kllhr\" (UID: \"831db5b2-5229-4b52-8783-f99c640ba856\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.762547 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrxsc\" (UniqueName: \"kubernetes.io/projected/6085cb91-fec3-45bd-bfdc-a10e6043049f-kube-api-access-rrxsc\") pod \"dns-default-z9thp\" (UID: \"6085cb91-fec3-45bd-bfdc-a10e6043049f\") " pod="openshift-dns/dns-default-z9thp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.762566 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/41e982da-ccd1-4b0c-9f0e-c220e06052a0-srv-cert\") pod \"catalog-operator-68c6474976-4gsck\" (UID: \"41e982da-ccd1-4b0c-9f0e-c220e06052a0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.762729 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57-proxy-tls\") pod \"machine-config-operator-74547568cd-69xj9\" (UID: \"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.762849 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57-images\") pod \"machine-config-operator-74547568cd-69xj9\" (UID: \"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.763441 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8fb85cad-ec2d-4ada-bd68-55937d96a779-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8vgmn\" (UID: \"8fb85cad-ec2d-4ada-bd68-55937d96a779\") " pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.763733 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-registry-certificates\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.764918 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j25rc\" (UniqueName: \"kubernetes.io/projected/8fb85cad-ec2d-4ada-bd68-55937d96a779-kube-api-access-j25rc\") pod \"marketplace-operator-79b997595-8vgmn\" (UID: \"8fb85cad-ec2d-4ada-bd68-55937d96a779\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.765390 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/41e982da-ccd1-4b0c-9f0e-c220e06052a0-profile-collector-cert\") pod \"catalog-operator-68c6474976-4gsck\" (UID: \"41e982da-ccd1-4b0c-9f0e-c220e06052a0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.765449 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-registry-certificates\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.765454 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzdbz\" (UniqueName: \"kubernetes.io/projected/85ac6950-8b98-4d0c-8a2b-7eeeac8d1435-kube-api-access-jzdbz\") pod \"downloads-7954f5f757-8ktsx\" (UID: \"85ac6950-8b98-4d0c-8a2b-7eeeac8d1435\") " pod="openshift-console/downloads-7954f5f757-8ktsx" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.765671 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/831db5b2-5229-4b52-8783-f99c640ba856-config-volume\") pod \"collect-profiles-29556705-kllhr\" (UID: \"831db5b2-5229-4b52-8783-f99c640ba856\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.765935 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.766027 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e366a2cd-5dfa-45c9-b187-92772da0b827-profile-collector-cert\") pod \"olm-operator-6b444d44fb-659h7\" (UID: \"e366a2cd-5dfa-45c9-b187-92772da0b827\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.766051 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b42d2c64-cd10-4923-aed0-dc586696da9a-node-bootstrap-token\") pod \"machine-config-server-9g2bm\" (UID: \"b42d2c64-cd10-4923-aed0-dc586696da9a\") " pod="openshift-machine-config-operator/machine-config-server-9g2bm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.766124 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/998432c5-238a-466a-a779-7d5126210706-serving-cert\") pod \"service-ca-operator-777779d784-ng8zt\" (UID: \"998432c5-238a-466a-a779-7d5126210706\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.766179 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfq6d\" (UniqueName: \"kubernetes.io/projected/b42d2c64-cd10-4923-aed0-dc586696da9a-kube-api-access-nfq6d\") pod \"machine-config-server-9g2bm\" (UID: \"b42d2c64-cd10-4923-aed0-dc586696da9a\") " pod="openshift-machine-config-operator/machine-config-server-9g2bm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.766756 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.766830 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llv9w\" (UniqueName: \"kubernetes.io/projected/998432c5-238a-466a-a779-7d5126210706-kube-api-access-llv9w\") pod \"service-ca-operator-777779d784-ng8zt\" (UID: \"998432c5-238a-466a-a779-7d5126210706\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.767005 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-bound-sa-token\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.767175 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: E0313 11:51:31.767491 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-13 11:51:32.267474619 +0000 UTC m=+207.905741452 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.767543 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbmq8\" (UniqueName: \"kubernetes.io/projected/8bc71239-c925-4911-bfa5-e7a564dcd654-kube-api-access-dbmq8\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.767762 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2960b8ba-5517-4915-b524-1f3f6d0f043c-signing-key\") pod \"service-ca-9c57cc56f-xfcxm\" (UID: \"2960b8ba-5517-4915-b524-1f3f6d0f043c\") " pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.767959 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpw7f\" (UniqueName: \"kubernetes.io/projected/416fd214-ef6d-45b4-bf11-a35c92909523-kube-api-access-vpw7f\") pod \"dns-operator-744455d44c-8zzqp\" (UID: \"416fd214-ef6d-45b4-bf11-a35c92909523\") " pod="openshift-dns-operator/dns-operator-744455d44c-8zzqp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.767997 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" 
(UniqueName: \"kubernetes.io/secret/b42d2c64-cd10-4923-aed0-dc586696da9a-certs\") pod \"machine-config-server-9g2bm\" (UID: \"b42d2c64-cd10-4923-aed0-dc586696da9a\") " pod="openshift-machine-config-operator/machine-config-server-9g2bm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.768057 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8kxg\" (UniqueName: \"kubernetes.io/projected/d0005e35-a11c-4773-a0d1-94fa4aff8a14-kube-api-access-d8kxg\") pod \"packageserver-d55dfcdfc-xhx6c\" (UID: \"d0005e35-a11c-4773-a0d1-94fa4aff8a14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.768079 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/998432c5-238a-466a-a779-7d5126210706-config\") pod \"service-ca-operator-777779d784-ng8zt\" (UID: \"998432c5-238a-466a-a779-7d5126210706\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.768717 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57-proxy-tls\") pod \"machine-config-operator-74547568cd-69xj9\" (UID: \"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.768809 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s78rb\" (UniqueName: \"kubernetes.io/projected/90308f63-bacc-491b-9ce2-ffbb2eaaea1f-kube-api-access-s78rb\") pod \"ingress-canary-9hkj4\" (UID: \"90308f63-bacc-491b-9ce2-ffbb2eaaea1f\") " pod="openshift-ingress-canary/ingress-canary-9hkj4" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 
11:51:31.768839 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fb85cad-ec2d-4ada-bd68-55937d96a779-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8vgmn\" (UID: \"8fb85cad-ec2d-4ada-bd68-55937d96a779\") " pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.768893 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd28x\" (UniqueName: \"kubernetes.io/projected/41e982da-ccd1-4b0c-9f0e-c220e06052a0-kube-api-access-wd28x\") pod \"catalog-operator-68c6474976-4gsck\" (UID: \"41e982da-ccd1-4b0c-9f0e-c220e06052a0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.768933 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-plugins-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.768997 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.769131 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9zsz\" (UniqueName: \"kubernetes.io/projected/e366a2cd-5dfa-45c9-b187-92772da0b827-kube-api-access-d9zsz\") pod 
\"olm-operator-6b444d44fb-659h7\" (UID: \"e366a2cd-5dfa-45c9-b187-92772da0b827\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.769339 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7xs7\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-kube-api-access-g7xs7\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.771315 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/416fd214-ef6d-45b4-bf11-a35c92909523-metrics-tls\") pod \"dns-operator-744455d44c-8zzqp\" (UID: \"416fd214-ef6d-45b4-bf11-a35c92909523\") " pod="openshift-dns-operator/dns-operator-744455d44c-8zzqp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.771829 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6085cb91-fec3-45bd-bfdc-a10e6043049f-config-volume\") pod \"dns-default-z9thp\" (UID: \"6085cb91-fec3-45bd-bfdc-a10e6043049f\") " pod="openshift-dns/dns-default-z9thp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.772308 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/831db5b2-5229-4b52-8783-f99c640ba856-secret-volume\") pod \"collect-profiles-29556705-kllhr\" (UID: \"831db5b2-5229-4b52-8783-f99c640ba856\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.772512 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.774760 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2960b8ba-5517-4915-b524-1f3f6d0f043c-signing-cabundle\") pod \"service-ca-9c57cc56f-xfcxm\" (UID: \"2960b8ba-5517-4915-b524-1f3f6d0f043c\") " pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.776262 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/416fd214-ef6d-45b4-bf11-a35c92909523-metrics-tls\") pod \"dns-operator-744455d44c-8zzqp\" (UID: \"416fd214-ef6d-45b4-bf11-a35c92909523\") " pod="openshift-dns-operator/dns-operator-744455d44c-8zzqp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.788678 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.788915 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-registry-tls\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.799602 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b45pg\" (UniqueName: \"kubernetes.io/projected/edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57-kube-api-access-b45pg\") pod 
\"machine-config-operator-74547568cd-69xj9\" (UID: \"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.805903 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.810303 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.811247 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.814685 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzdbz\" (UniqueName: \"kubernetes.io/projected/85ac6950-8b98-4d0c-8a2b-7eeeac8d1435-kube-api-access-jzdbz\") pod \"downloads-7954f5f757-8ktsx\" (UID: \"85ac6950-8b98-4d0c-8a2b-7eeeac8d1435\") " pod="openshift-console/downloads-7954f5f757-8ktsx" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.839522 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.843092 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9dbhc"] Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.843803 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-bound-sa-token\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.849331 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs"] Mar 13 11:51:31 crc kubenswrapper[4837]: W0313 11:51:31.862307 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8bc408a_bca6_42ff_8572_2ba9a3978682.slice/crio-79d44a31b910bec33360921358068b0857727b0bb4c82bc65255018460fa2174 WatchSource:0}: Error finding container 79d44a31b910bec33360921358068b0857727b0bb4c82bc65255018460fa2174: Status 404 returned error can't find the container with id 79d44a31b910bec33360921358068b0857727b0bb4c82bc65255018460fa2174 Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.864670 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr"] Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.866562 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wcfj4"] Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.869747 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpw7f\" (UniqueName: 
\"kubernetes.io/projected/416fd214-ef6d-45b4-bf11-a35c92909523-kube-api-access-vpw7f\") pod \"dns-operator-744455d44c-8zzqp\" (UID: \"416fd214-ef6d-45b4-bf11-a35c92909523\") " pod="openshift-dns-operator/dns-operator-744455d44c-8zzqp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875440 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875583 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2960b8ba-5517-4915-b524-1f3f6d0f043c-signing-cabundle\") pod \"service-ca-9c57cc56f-xfcxm\" (UID: \"2960b8ba-5517-4915-b524-1f3f6d0f043c\") " pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875615 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6085cb91-fec3-45bd-bfdc-a10e6043049f-metrics-tls\") pod \"dns-default-z9thp\" (UID: \"6085cb91-fec3-45bd-bfdc-a10e6043049f\") " pod="openshift-dns/dns-default-z9thp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875633 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d0005e35-a11c-4773-a0d1-94fa4aff8a14-apiservice-cert\") pod \"packageserver-d55dfcdfc-xhx6c\" (UID: \"d0005e35-a11c-4773-a0d1-94fa4aff8a14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875667 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/d0005e35-a11c-4773-a0d1-94fa4aff8a14-webhook-cert\") pod \"packageserver-d55dfcdfc-xhx6c\" (UID: \"d0005e35-a11c-4773-a0d1-94fa4aff8a14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875681 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/90308f63-bacc-491b-9ce2-ffbb2eaaea1f-cert\") pod \"ingress-canary-9hkj4\" (UID: \"90308f63-bacc-491b-9ce2-ffbb2eaaea1f\") " pod="openshift-ingress-canary/ingress-canary-9hkj4" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875694 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e366a2cd-5dfa-45c9-b187-92772da0b827-srv-cert\") pod \"olm-operator-6b444d44fb-659h7\" (UID: \"e366a2cd-5dfa-45c9-b187-92772da0b827\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875711 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjqzb\" (UniqueName: \"kubernetes.io/projected/2960b8ba-5517-4915-b524-1f3f6d0f043c-kube-api-access-fjqzb\") pod \"service-ca-9c57cc56f-xfcxm\" (UID: \"2960b8ba-5517-4915-b524-1f3f6d0f043c\") " pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875723 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d0005e35-a11c-4773-a0d1-94fa4aff8a14-tmpfs\") pod \"packageserver-d55dfcdfc-xhx6c\" (UID: \"d0005e35-a11c-4773-a0d1-94fa4aff8a14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875737 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlwdw\" (UniqueName: 
\"kubernetes.io/projected/0484d991-f239-47a2-80ff-0237945c27ac-kube-api-access-dlwdw\") pod \"auto-csr-approver-29556710-lcprh\" (UID: \"0484d991-f239-47a2-80ff-0237945c27ac\") " pod="openshift-infra/auto-csr-approver-29556710-lcprh" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875763 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-registration-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875779 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-csi-data-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875800 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-mountpoint-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875817 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl4f8\" (UniqueName: \"kubernetes.io/projected/831db5b2-5229-4b52-8783-f99c640ba856-kube-api-access-pl4f8\") pod \"collect-profiles-29556705-kllhr\" (UID: \"831db5b2-5229-4b52-8783-f99c640ba856\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875831 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rrxsc\" (UniqueName: \"kubernetes.io/projected/6085cb91-fec3-45bd-bfdc-a10e6043049f-kube-api-access-rrxsc\") pod \"dns-default-z9thp\" (UID: \"6085cb91-fec3-45bd-bfdc-a10e6043049f\") " pod="openshift-dns/dns-default-z9thp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875846 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-socket-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875869 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/41e982da-ccd1-4b0c-9f0e-c220e06052a0-srv-cert\") pod \"catalog-operator-68c6474976-4gsck\" (UID: \"41e982da-ccd1-4b0c-9f0e-c220e06052a0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875889 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8fb85cad-ec2d-4ada-bd68-55937d96a779-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8vgmn\" (UID: \"8fb85cad-ec2d-4ada-bd68-55937d96a779\") " pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875910 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j25rc\" (UniqueName: \"kubernetes.io/projected/8fb85cad-ec2d-4ada-bd68-55937d96a779-kube-api-access-j25rc\") pod \"marketplace-operator-79b997595-8vgmn\" (UID: \"8fb85cad-ec2d-4ada-bd68-55937d96a779\") " pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875926 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/41e982da-ccd1-4b0c-9f0e-c220e06052a0-profile-collector-cert\") pod \"catalog-operator-68c6474976-4gsck\" (UID: \"41e982da-ccd1-4b0c-9f0e-c220e06052a0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875947 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/831db5b2-5229-4b52-8783-f99c640ba856-config-volume\") pod \"collect-profiles-29556705-kllhr\" (UID: \"831db5b2-5229-4b52-8783-f99c640ba856\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875962 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e366a2cd-5dfa-45c9-b187-92772da0b827-profile-collector-cert\") pod \"olm-operator-6b444d44fb-659h7\" (UID: \"e366a2cd-5dfa-45c9-b187-92772da0b827\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875978 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b42d2c64-cd10-4923-aed0-dc586696da9a-node-bootstrap-token\") pod \"machine-config-server-9g2bm\" (UID: \"b42d2c64-cd10-4923-aed0-dc586696da9a\") " pod="openshift-machine-config-operator/machine-config-server-9g2bm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875996 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/998432c5-238a-466a-a779-7d5126210706-serving-cert\") pod \"service-ca-operator-777779d784-ng8zt\" (UID: \"998432c5-238a-466a-a779-7d5126210706\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876015 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfq6d\" (UniqueName: \"kubernetes.io/projected/b42d2c64-cd10-4923-aed0-dc586696da9a-kube-api-access-nfq6d\") pod \"machine-config-server-9g2bm\" (UID: \"b42d2c64-cd10-4923-aed0-dc586696da9a\") " pod="openshift-machine-config-operator/machine-config-server-9g2bm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876039 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llv9w\" (UniqueName: \"kubernetes.io/projected/998432c5-238a-466a-a779-7d5126210706-kube-api-access-llv9w\") pod \"service-ca-operator-777779d784-ng8zt\" (UID: \"998432c5-238a-466a-a779-7d5126210706\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876062 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbmq8\" (UniqueName: \"kubernetes.io/projected/8bc71239-c925-4911-bfa5-e7a564dcd654-kube-api-access-dbmq8\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876077 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2960b8ba-5517-4915-b524-1f3f6d0f043c-signing-key\") pod \"service-ca-9c57cc56f-xfcxm\" (UID: \"2960b8ba-5517-4915-b524-1f3f6d0f043c\") " pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876102 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b42d2c64-cd10-4923-aed0-dc586696da9a-certs\") pod 
\"machine-config-server-9g2bm\" (UID: \"b42d2c64-cd10-4923-aed0-dc586696da9a\") " pod="openshift-machine-config-operator/machine-config-server-9g2bm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876131 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8kxg\" (UniqueName: \"kubernetes.io/projected/d0005e35-a11c-4773-a0d1-94fa4aff8a14-kube-api-access-d8kxg\") pod \"packageserver-d55dfcdfc-xhx6c\" (UID: \"d0005e35-a11c-4773-a0d1-94fa4aff8a14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876153 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/998432c5-238a-466a-a779-7d5126210706-config\") pod \"service-ca-operator-777779d784-ng8zt\" (UID: \"998432c5-238a-466a-a779-7d5126210706\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876176 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s78rb\" (UniqueName: \"kubernetes.io/projected/90308f63-bacc-491b-9ce2-ffbb2eaaea1f-kube-api-access-s78rb\") pod \"ingress-canary-9hkj4\" (UID: \"90308f63-bacc-491b-9ce2-ffbb2eaaea1f\") " pod="openshift-ingress-canary/ingress-canary-9hkj4" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876196 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fb85cad-ec2d-4ada-bd68-55937d96a779-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8vgmn\" (UID: \"8fb85cad-ec2d-4ada-bd68-55937d96a779\") " pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876216 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd28x\" 
(UniqueName: \"kubernetes.io/projected/41e982da-ccd1-4b0c-9f0e-c220e06052a0-kube-api-access-wd28x\") pod \"catalog-operator-68c6474976-4gsck\" (UID: \"41e982da-ccd1-4b0c-9f0e-c220e06052a0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876239 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-plugins-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876271 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9zsz\" (UniqueName: \"kubernetes.io/projected/e366a2cd-5dfa-45c9-b187-92772da0b827-kube-api-access-d9zsz\") pod \"olm-operator-6b444d44fb-659h7\" (UID: \"e366a2cd-5dfa-45c9-b187-92772da0b827\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876300 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6085cb91-fec3-45bd-bfdc-a10e6043049f-config-volume\") pod \"dns-default-z9thp\" (UID: \"6085cb91-fec3-45bd-bfdc-a10e6043049f\") " pod="openshift-dns/dns-default-z9thp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876315 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/831db5b2-5229-4b52-8783-f99c640ba856-secret-volume\") pod \"collect-profiles-29556705-kllhr\" (UID: \"831db5b2-5229-4b52-8783-f99c640ba856\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.877141 4837 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-registration-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.877166 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d0005e35-a11c-4773-a0d1-94fa4aff8a14-tmpfs\") pod \"packageserver-d55dfcdfc-xhx6c\" (UID: \"d0005e35-a11c-4773-a0d1-94fa4aff8a14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.877228 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-csi-data-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: E0313 11:51:31.877247 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:32.377229685 +0000 UTC m=+208.015496448 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.877281 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-mountpoint-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.877467 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-socket-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.880169 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-plugins-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.881075 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fb85cad-ec2d-4ada-bd68-55937d96a779-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8vgmn\" (UID: \"8fb85cad-ec2d-4ada-bd68-55937d96a779\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:51:31 crc kubenswrapper[4837]: W0313 11:51:31.881308 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a3cabe4_69ee_49f7_a783_e72ac1a56821.slice/crio-2bc8a3d69075e5c30fa5b45ad6a0c6f1944dbbd0064acaad5eadf14dc600adc9 WatchSource:0}: Error finding container 2bc8a3d69075e5c30fa5b45ad6a0c6f1944dbbd0064acaad5eadf14dc600adc9: Status 404 returned error can't find the container with id 2bc8a3d69075e5c30fa5b45ad6a0c6f1944dbbd0064acaad5eadf14dc600adc9 Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.881412 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2960b8ba-5517-4915-b524-1f3f6d0f043c-signing-cabundle\") pod \"service-ca-9c57cc56f-xfcxm\" (UID: \"2960b8ba-5517-4915-b524-1f3f6d0f043c\") " pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.881740 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/831db5b2-5229-4b52-8783-f99c640ba856-config-volume\") pod \"collect-profiles-29556705-kllhr\" (UID: \"831db5b2-5229-4b52-8783-f99c640ba856\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.883048 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e366a2cd-5dfa-45c9-b187-92772da0b827-profile-collector-cert\") pod \"olm-operator-6b444d44fb-659h7\" (UID: \"e366a2cd-5dfa-45c9-b187-92772da0b827\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.883241 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" 
(UniqueName: \"kubernetes.io/secret/b42d2c64-cd10-4923-aed0-dc586696da9a-node-bootstrap-token\") pod \"machine-config-server-9g2bm\" (UID: \"b42d2c64-cd10-4923-aed0-dc586696da9a\") " pod="openshift-machine-config-operator/machine-config-server-9g2bm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.883436 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/41e982da-ccd1-4b0c-9f0e-c220e06052a0-srv-cert\") pod \"catalog-operator-68c6474976-4gsck\" (UID: \"41e982da-ccd1-4b0c-9f0e-c220e06052a0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.883619 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d0005e35-a11c-4773-a0d1-94fa4aff8a14-webhook-cert\") pod \"packageserver-d55dfcdfc-xhx6c\" (UID: \"d0005e35-a11c-4773-a0d1-94fa4aff8a14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.884074 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/831db5b2-5229-4b52-8783-f99c640ba856-secret-volume\") pod \"collect-profiles-29556705-kllhr\" (UID: \"831db5b2-5229-4b52-8783-f99c640ba856\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.884368 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d0005e35-a11c-4773-a0d1-94fa4aff8a14-apiservice-cert\") pod \"packageserver-d55dfcdfc-xhx6c\" (UID: \"d0005e35-a11c-4773-a0d1-94fa4aff8a14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.885211 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5"] Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.885996 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8fb85cad-ec2d-4ada-bd68-55937d96a779-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8vgmn\" (UID: \"8fb85cad-ec2d-4ada-bd68-55937d96a779\") " pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.886286 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6085cb91-fec3-45bd-bfdc-a10e6043049f-metrics-tls\") pod \"dns-default-z9thp\" (UID: \"6085cb91-fec3-45bd-bfdc-a10e6043049f\") " pod="openshift-dns/dns-default-z9thp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.886461 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/41e982da-ccd1-4b0c-9f0e-c220e06052a0-profile-collector-cert\") pod \"catalog-operator-68c6474976-4gsck\" (UID: \"41e982da-ccd1-4b0c-9f0e-c220e06052a0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.886493 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b42d2c64-cd10-4923-aed0-dc586696da9a-certs\") pod \"machine-config-server-9g2bm\" (UID: \"b42d2c64-cd10-4923-aed0-dc586696da9a\") " pod="openshift-machine-config-operator/machine-config-server-9g2bm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.886468 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e366a2cd-5dfa-45c9-b187-92772da0b827-srv-cert\") pod \"olm-operator-6b444d44fb-659h7\" (UID: 
\"e366a2cd-5dfa-45c9-b187-92772da0b827\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.896112 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7xs7\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-kube-api-access-g7xs7\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.935735 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j25rc\" (UniqueName: \"kubernetes.io/projected/8fb85cad-ec2d-4ada-bd68-55937d96a779-kube-api-access-j25rc\") pod \"marketplace-operator-79b997595-8vgmn\" (UID: \"8fb85cad-ec2d-4ada-bd68-55937d96a779\") " pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.937914 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-8dj7w"] Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.948382 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6085cb91-fec3-45bd-bfdc-a10e6043049f-config-volume\") pod \"dns-default-z9thp\" (UID: \"6085cb91-fec3-45bd-bfdc-a10e6043049f\") " pod="openshift-dns/dns-default-z9thp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.950295 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/998432c5-238a-466a-a779-7d5126210706-config\") pod \"service-ca-operator-777779d784-ng8zt\" (UID: \"998432c5-238a-466a-a779-7d5126210706\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.951964 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/998432c5-238a-466a-a779-7d5126210706-serving-cert\") pod \"service-ca-operator-777779d784-ng8zt\" (UID: \"998432c5-238a-466a-a779-7d5126210706\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.952218 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/90308f63-bacc-491b-9ce2-ffbb2eaaea1f-cert\") pod \"ingress-canary-9hkj4\" (UID: \"90308f63-bacc-491b-9ce2-ffbb2eaaea1f\") " pod="openshift-ingress-canary/ingress-canary-9hkj4" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.953091 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlwdw\" (UniqueName: \"kubernetes.io/projected/0484d991-f239-47a2-80ff-0237945c27ac-kube-api-access-dlwdw\") pod \"auto-csr-approver-29556710-lcprh\" (UID: \"0484d991-f239-47a2-80ff-0237945c27ac\") " pod="openshift-infra/auto-csr-approver-29556710-lcprh" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.958056 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2960b8ba-5517-4915-b524-1f3f6d0f043c-signing-key\") pod \"service-ca-9c57cc56f-xfcxm\" (UID: \"2960b8ba-5517-4915-b524-1f3f6d0f043c\") " pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.975532 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl4f8\" (UniqueName: \"kubernetes.io/projected/831db5b2-5229-4b52-8783-f99c640ba856-kube-api-access-pl4f8\") pod \"collect-profiles-29556705-kllhr\" (UID: \"831db5b2-5229-4b52-8783-f99c640ba856\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.978472 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: E0313 11:51:31.978961 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:32.478946229 +0000 UTC m=+208.117212992 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.982140 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-64xpb"] Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.984565 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9tkxg" event={"ID":"3eaa54fb-8d70-463c-8388-9f8443a480ed","Type":"ContainerStarted","Data":"b1b89859732d6e3f130db8770074382bf2ec9c2b4d0b2c135f5f19ccd80108b4"} Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.985859 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" 
event={"ID":"6db10103-96be-4420-b302-a7064e347f61","Type":"ContainerStarted","Data":"b8b9904f90ea9cab9b908c8386f85ff72414d4e5b210240fa04eb6214cfb4a49"} Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.991519 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" event={"ID":"27d45de2-e0ab-4c3e-b3da-b20e60e26801","Type":"ContainerStarted","Data":"48f88856d0aa99c22451af4774004c789a7baf644ed71ee96a301b56c7368078"} Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.992474 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f"] Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.994157 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-8ktsx" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.998175 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrxsc\" (UniqueName: \"kubernetes.io/projected/6085cb91-fec3-45bd-bfdc-a10e6043049f-kube-api-access-rrxsc\") pod \"dns-default-z9thp\" (UID: \"6085cb91-fec3-45bd-bfdc-a10e6043049f\") " pod="openshift-dns/dns-default-z9thp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.998318 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q2qpt" event={"ID":"c83842ec-9933-4f84-bb4a-c84ca61a28e1","Type":"ContainerStarted","Data":"6d6886f8a08a9d6498bf2731a6faf601bf8b43c566b4a0dbe066c5557e5e15e0"} Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.010263 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" event={"ID":"d8974a7e-ac32-4644-b7ee-2d3908daf2fa","Type":"ContainerStarted","Data":"8bd1e25605040a3ded2b2fcdf3aba6d9b1057256f3fd36ccbc6a37df5954cca1"} Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 
11:51:32.015414 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" event={"ID":"3e1f747d-78f3-4cbc-b313-eed531936c02","Type":"ContainerStarted","Data":"3724ccf51dcc3ce781ab7f660589de5a9700c6b6b97a2a3012f9580d869b7a9e"} Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.016975 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" event={"ID":"5a3cabe4-69ee-49f7-a783-e72ac1a56821","Type":"ContainerStarted","Data":"2bc8a3d69075e5c30fa5b45ad6a0c6f1944dbbd0064acaad5eadf14dc600adc9"} Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.024825 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" event={"ID":"f8bc408a-bca6-42ff-8572-2ba9a3978682","Type":"ContainerStarted","Data":"79d44a31b910bec33360921358068b0857727b0bb4c82bc65255018460fa2174"} Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.025461 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjqzb\" (UniqueName: \"kubernetes.io/projected/2960b8ba-5517-4915-b524-1f3f6d0f043c-kube-api-access-fjqzb\") pod \"service-ca-9c57cc56f-xfcxm\" (UID: \"2960b8ba-5517-4915-b524-1f3f6d0f043c\") " pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.029985 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" event={"ID":"ffb5553f-d2d5-4584-9bf8-7212a378f358","Type":"ContainerStarted","Data":"e21d8c3cf5026263c4f5424f66828ba6aa5db357c326f97aa914eb0972b97eb0"} Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.035194 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llv9w\" (UniqueName: \"kubernetes.io/projected/998432c5-238a-466a-a779-7d5126210706-kube-api-access-llv9w\") pod 
\"service-ca-operator-777779d784-ng8zt\" (UID: \"998432c5-238a-466a-a779-7d5126210706\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.036342 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" event={"ID":"10ac507b-7307-4e09-ab72-b956d0139396","Type":"ContainerStarted","Data":"4bd51b06146c5f096b2a54598de78033d1db9d6e3a286772a166210354044d28"} Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.041686 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" event={"ID":"f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0","Type":"ContainerStarted","Data":"08dd677887c4dde1b1c0188517f8488051597c9a84fd864cd57af21ec82e64e6"} Mar 13 11:51:32 crc kubenswrapper[4837]: W0313 11:51:32.041766 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44f59229_dec6_4d9b_a63b_bd562b4523cf.slice/crio-3ef1ad2cfd596ea09a0783d40ce9413e65a0ca7227111a529229125c392c9dd0 WatchSource:0}: Error finding container 3ef1ad2cfd596ea09a0783d40ce9413e65a0ca7227111a529229125c392c9dd0: Status 404 returned error can't find the container with id 3ef1ad2cfd596ea09a0783d40ce9413e65a0ca7227111a529229125c392c9dd0 Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.043343 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl"] Mar 13 11:51:32 crc kubenswrapper[4837]: W0313 11:51:32.046128 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod003e8201_4e67_4356_b0c1_8cc135451069.slice/crio-bac019194e3b35d01c48fc764e4ea75894e6d260733b68a6e762af1ad5c86dc7 WatchSource:0}: Error finding container 
bac019194e3b35d01c48fc764e4ea75894e6d260733b68a6e762af1ad5c86dc7: Status 404 returned error can't find the container with id bac019194e3b35d01c48fc764e4ea75894e6d260733b68a6e762af1ad5c86dc7 Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.048614 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp"] Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.055205 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfq6d\" (UniqueName: \"kubernetes.io/projected/b42d2c64-cd10-4923-aed0-dc586696da9a-kube-api-access-nfq6d\") pod \"machine-config-server-9g2bm\" (UID: \"b42d2c64-cd10-4923-aed0-dc586696da9a\") " pod="openshift-machine-config-operator/machine-config-server-9g2bm" Mar 13 11:51:32 crc kubenswrapper[4837]: W0313 11:51:32.057648 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c2663fa_7df3_4801_be78_52517eb1f1cf.slice/crio-87b29cac5b0d8102de47d1b2f0bf13cd354e9c6469e4cd5a1e05a6523362cce8 WatchSource:0}: Error finding container 87b29cac5b0d8102de47d1b2f0bf13cd354e9c6469e4cd5a1e05a6523362cce8: Status 404 returned error can't find the container with id 87b29cac5b0d8102de47d1b2f0bf13cd354e9c6469e4cd5a1e05a6523362cce8 Mar 13 11:51:32 crc kubenswrapper[4837]: W0313 11:51:32.058519 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe42bd29_b8a7_4a9f_89e2_ab3b944d7c26.slice/crio-947cbb59d184a3abaa202539dda8ef07f684be8aca67a5026c36c546ecb7c17d WatchSource:0}: Error finding container 947cbb59d184a3abaa202539dda8ef07f684be8aca67a5026c36c546ecb7c17d: Status 404 returned error can't find the container with id 947cbb59d184a3abaa202539dda8ef07f684be8aca67a5026c36c546ecb7c17d Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.073109 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wd28x\" (UniqueName: \"kubernetes.io/projected/41e982da-ccd1-4b0c-9f0e-c220e06052a0-kube-api-access-wd28x\") pod \"catalog-operator-68c6474976-4gsck\" (UID: \"41e982da-ccd1-4b0c-9f0e-c220e06052a0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.079610 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:32 crc kubenswrapper[4837]: E0313 11:51:32.080017 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:32.579997042 +0000 UTC m=+208.218263815 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.081349 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-8zzqp" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.086305 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc"] Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.091998 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9zsz\" (UniqueName: \"kubernetes.io/projected/e366a2cd-5dfa-45c9-b187-92772da0b827-kube-api-access-d9zsz\") pod \"olm-operator-6b444d44fb-659h7\" (UID: \"e366a2cd-5dfa-45c9-b187-92772da0b827\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" Mar 13 11:51:32 crc kubenswrapper[4837]: W0313 11:51:32.092711 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod255ab2ef_dead_4148_bc85_2514618767b9.slice/crio-eb83a01e844ced5ebb50f5c6f4f23872dfc2208d1286a3c2f7a421eb4b88a923 WatchSource:0}: Error finding container eb83a01e844ced5ebb50f5c6f4f23872dfc2208d1286a3c2f7a421eb4b88a923: Status 404 returned error can't find the container with id eb83a01e844ced5ebb50f5c6f4f23872dfc2208d1286a3c2f7a421eb4b88a923 Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.124657 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8kxg\" (UniqueName: \"kubernetes.io/projected/d0005e35-a11c-4773-a0d1-94fa4aff8a14-kube-api-access-d8kxg\") pod \"packageserver-d55dfcdfc-xhx6c\" (UID: \"d0005e35-a11c-4773-a0d1-94fa4aff8a14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.139121 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s78rb\" (UniqueName: \"kubernetes.io/projected/90308f63-bacc-491b-9ce2-ffbb2eaaea1f-kube-api-access-s78rb\") pod \"ingress-canary-9hkj4\" (UID: 
\"90308f63-bacc-491b-9ce2-ffbb2eaaea1f\") " pod="openshift-ingress-canary/ingress-canary-9hkj4" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.147134 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.150858 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.158232 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbmq8\" (UniqueName: \"kubernetes.io/projected/8bc71239-c925-4911-bfa5-e7a564dcd654-kube-api-access-dbmq8\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.165202 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.172507 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.182568 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:32 crc kubenswrapper[4837]: E0313 11:51:32.183345 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:32.683325797 +0000 UTC m=+208.321592560 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.183576 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.195304 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.226344 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556710-lcprh" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.234439 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.248032 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.273528 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-z9thp" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.281822 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9hkj4" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.284668 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:32 crc kubenswrapper[4837]: E0313 11:51:32.285050 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:32.785028531 +0000 UTC m=+208.423295294 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.287755 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9g2bm" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.386426 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:32 crc kubenswrapper[4837]: E0313 11:51:32.389153 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:32.888702427 +0000 UTC m=+208.526969190 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.487748 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:32 crc kubenswrapper[4837]: E0313 11:51:32.488102 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:32.988071168 +0000 UTC m=+208.626337931 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.531789 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9"] Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.578813 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q"] Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.580583 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn"] Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.589015 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:32 crc kubenswrapper[4837]: E0313 11:51:32.589442 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:33.089428411 +0000 UTC m=+208.727695174 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.598951 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t"] Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.606602 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q"] Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.660558 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht"] Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.683823 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-l4rxn"] Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.692433 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:32 crc kubenswrapper[4837]: E0313 11:51:32.692688 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 11:51:33.192621461 +0000 UTC m=+208.830888224 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.692813 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:32 crc kubenswrapper[4837]: E0313 11:51:32.693215 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:33.1932007 +0000 UTC m=+208.831467463 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.793987 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:32 crc kubenswrapper[4837]: E0313 11:51:32.794583 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:33.294567843 +0000 UTC m=+208.932834606 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.908181 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:32 crc kubenswrapper[4837]: E0313 11:51:32.908517 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:33.40850217 +0000 UTC m=+209.046768933 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.941095 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8ktsx"] Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.009728 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:33 crc kubenswrapper[4837]: E0313 11:51:33.010064 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:33.510039599 +0000 UTC m=+209.148306362 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:33 crc kubenswrapper[4837]: W0313 11:51:33.105916 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85ac6950_8b98_4d0c_8a2b_7eeeac8d1435.slice/crio-e010eb156dd5e54ff5b67933f30569e7dfedf9e8ebf856adbf791bedb5ca7007 WatchSource:0}: Error finding container e010eb156dd5e54ff5b67933f30569e7dfedf9e8ebf856adbf791bedb5ca7007: Status 404 returned error can't find the container with id e010eb156dd5e54ff5b67933f30569e7dfedf9e8ebf856adbf791bedb5ca7007 Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.106820 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.106851 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.106868 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht" event={"ID":"9025cb05-7c57-488b-a8cb-441552547aae","Type":"ContainerStarted","Data":"fe0205be8dce14655377c945d336b37f32e1ec709e068892e4478220341b3086"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.106890 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" 
event={"ID":"5a3cabe4-69ee-49f7-a783-e72ac1a56821","Type":"ContainerStarted","Data":"0b1af16cc6188236788eb10501019d25c79e6c73c18075a85efbfcfdd6e8d90d"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.106899 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" event={"ID":"27d45de2-e0ab-4c3e-b3da-b20e60e26801","Type":"ContainerStarted","Data":"7788f0babcbd0ba3005289dc42abd3560a56f1f0efe57b0376342454820793c4"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.106910 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.106919 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" event={"ID":"f8bc408a-bca6-42ff-8572-2ba9a3978682","Type":"ContainerStarted","Data":"0737572e5f80685157a6578fd12aead5fdbe12b0fbb802f48732112a9a3e2ca5"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.106927 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" event={"ID":"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57","Type":"ContainerStarted","Data":"d15d6148e3b4641e21c1219be8fd949b853388c951967107866968824b8b411d"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.106936 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" event={"ID":"d8974a7e-ac32-4644-b7ee-2d3908daf2fa","Type":"ContainerStarted","Data":"0f0ac7761242186c6486cc067d961428306b628dd37e60d608bfeec53567ce76"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.107781 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9tkxg" 
event={"ID":"3eaa54fb-8d70-463c-8388-9f8443a480ed","Type":"ContainerStarted","Data":"8909e983ef2780c7ed608dd72d62ffc2711e88e6e546fc3ca22041d9c9d9f368"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.111240 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:33 crc kubenswrapper[4837]: E0313 11:51:33.111725 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:33.611707772 +0000 UTC m=+209.249974535 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.118907 4837 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-8q6j6 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.118960 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" podUID="27d45de2-e0ab-4c3e-b3da-b20e60e26801" 
containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused"
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.118907 4837 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-9dbhc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.119046 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" podUID="f8bc408a-bca6-42ff-8572-2ba9a3978682" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused"
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.118921 4837 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-qs2qs container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body=
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.119122 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" podUID="5a3cabe4-69ee-49f7-a783-e72ac1a56821" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused"
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.120137 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-64xpb" event={"ID":"fe42bd29-b8a7-4a9f-89e2-ab3b944d7c26","Type":"ContainerStarted","Data":"6ca9ef7b6e1aeeb5f213e3151501982326403f14efe8a0393d6e04b54a7e03b1"}
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.120189 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-64xpb" event={"ID":"fe42bd29-b8a7-4a9f-89e2-ab3b944d7c26","Type":"ContainerStarted","Data":"947cbb59d184a3abaa202539dda8ef07f684be8aca67a5026c36c546ecb7c17d"}
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.125072 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-8dj7w" event={"ID":"003e8201-4e67-4356-b0c1-8cc135451069","Type":"ContainerStarted","Data":"24b5d222c04ee931204ab5357fdad08cdce8ed10f00338509d9dda5089a76343"}
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.125109 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-8dj7w" event={"ID":"003e8201-4e67-4356-b0c1-8cc135451069","Type":"ContainerStarted","Data":"bac019194e3b35d01c48fc764e4ea75894e6d260733b68a6e762af1ad5c86dc7"}
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.125600 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-8dj7w"
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.127700 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" event={"ID":"10be2947-2e91-4a8e-b54e-69cdab598955","Type":"ContainerStarted","Data":"27af68c1e8c6b927ac73e5302065bf60d6785c2334f48fb0b7e8b9eebff81e5d"}
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.143438 4837 patch_prober.go:28] interesting pod/console-operator-58897d9998-8dj7w container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.144806 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-8dj7w" podUID="003e8201-4e67-4356-b0c1-8cc135451069" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused"
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.153341 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp" event={"ID":"255ab2ef-dead-4148-bc85-2514618767b9","Type":"ContainerStarted","Data":"d3ac728c6efb3f8a9d434e9b2db73dabc494a1a025502dc770d39636e7643e21"}
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.153385 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp" event={"ID":"255ab2ef-dead-4148-bc85-2514618767b9","Type":"ContainerStarted","Data":"eb83a01e844ced5ebb50f5c6f4f23872dfc2208d1286a3c2f7a421eb4b88a923"}
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.213829 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.213871 4837 generic.go:334] "Generic (PLEG): container finished" podID="3e1f747d-78f3-4cbc-b313-eed531936c02" containerID="cc397fa1bf18472d61483cbd90123ad619e890e1c799ebf20e335d2d2900efd2" exitCode=0
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.215097 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" event={"ID":"3e1f747d-78f3-4cbc-b313-eed531936c02","Type":"ContainerDied","Data":"cc397fa1bf18472d61483cbd90123ad619e890e1c799ebf20e335d2d2900efd2"}
Mar 13 11:51:33 crc kubenswrapper[4837]: E0313 11:51:33.215600 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:33.715576794 +0000 UTC m=+209.353843607 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.222706 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl" event={"ID":"ec4b9459-d392-4fc5-9b6f-a87ca50e85b1","Type":"ContainerStarted","Data":"ac72c30ea5814f3e17ef0ea5566cc399d618ec18408f0a194a4e0e876289f7a3"}
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.225940 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc" event={"ID":"f5681b96-47c5-44f8-9e5d-671678930750","Type":"ContainerStarted","Data":"a4e9e8e683453ec1ec97eac55a88db48d29f9aaf6a70433eb3f631db4c7b81d9"}
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.278260 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" event={"ID":"10ac507b-7307-4e09-ab72-b956d0139396","Type":"ContainerStarted","Data":"87a66e2fda27d9418ac0248881f7d3de9402f85590f174cb3538e70e9955baa9"}
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.318422 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:33 crc kubenswrapper[4837]: E0313 11:51:33.321080 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:33.821060636 +0000 UTC m=+209.459327509 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.323418 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck"]
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.327450 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8zzqp"]
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.331759 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f" event={"ID":"2c2663fa-7df3-4801-be78-52517eb1f1cf","Type":"ContainerStarted","Data":"27571341ae0d307e0f5fe511b98e899863223e7cd1f546bfe8d5b19e1c7422da"}
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.331795 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f" event={"ID":"2c2663fa-7df3-4801-be78-52517eb1f1cf","Type":"ContainerStarted","Data":"87b29cac5b0d8102de47d1b2f0bf13cd354e9c6469e4cd5a1e05a6523362cce8"}
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.349851 4837 generic.go:334] "Generic (PLEG): container finished" podID="f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0" containerID="ca7706ee51695e703b71f0dfa955c9b51c9bd2ac8cba2d6910d4014415da7692" exitCode=0
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.349981 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" event={"ID":"f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0","Type":"ContainerDied","Data":"ca7706ee51695e703b71f0dfa955c9b51c9bd2ac8cba2d6910d4014415da7692"}
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.359877 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" event={"ID":"80d5bedc-a598-4779-be24-2d512ea7d148","Type":"ContainerStarted","Data":"236230e9d88a7986fa545c821a13267c56c256d3c89adf216bffb2cfb73ae7c2"}
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.361617 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5" event={"ID":"44f59229-dec6-4d9b-a63b-bd562b4523cf","Type":"ContainerStarted","Data":"3ef1ad2cfd596ea09a0783d40ce9413e65a0ca7227111a529229125c392c9dd0"}
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.362954 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t" event={"ID":"00848ba6-522a-45c7-81bd-7ab287d77626","Type":"ContainerStarted","Data":"bb3553ad20bce98adabe99a30a48b45691b06798afd9f7e7897e02cca605715d"}
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.363696 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" event={"ID":"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd","Type":"ContainerStarted","Data":"b69b65dfcd7e526c6965c8376e21c5fdea1ff5f7cf09fa0110e114348954b91d"}
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.365894 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" event={"ID":"6db10103-96be-4420-b302-a7064e347f61","Type":"ContainerStarted","Data":"34242361e539e0843f07ff6be10c070f33367f97039c925382d891dae818df9a"}
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.373772 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr" event={"ID":"6ad2861b-4f40-4551-8aff-304359734792","Type":"ContainerStarted","Data":"ecb6f4e687ec2286b4a0231baea09dabeeb624191aa322a66433bce22ed1353d"}
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.373819 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr" event={"ID":"6ad2861b-4f40-4551-8aff-304359734792","Type":"ContainerStarted","Data":"21bdf6a8e9acd59896f8179096077562702534a4681d2f2e339180db9df351b5"}
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.380438 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wcfj4" event={"ID":"e6a94afd-1f9a-4281-9d94-2fac3916f2c3","Type":"ContainerStarted","Data":"bb2d7b77a707e2a0fceb7edc70566799a53b9e335b57fb7bcf31960f63eba7da"}
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.391347 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx" event={"ID":"fca26784-7fdf-4923-bd07-35d182c2ad14","Type":"ContainerStarted","Data":"a0ca8c97b2911c4bf447e84741f129e0c258bce5a17e6f8304df1aff10c8aa04"}
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.391394 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx" event={"ID":"fca26784-7fdf-4923-bd07-35d182c2ad14","Type":"ContainerStarted","Data":"bf33e77fd4365841b90221599801ebed872e8e5e376cd3f2f3c89ea2cdd90c87"}
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.392809 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9g2bm" event={"ID":"b42d2c64-cd10-4923-aed0-dc586696da9a","Type":"ContainerStarted","Data":"25d408737e876ad711dada87c79742e26473e8799fa23d15fc77576a44b9ba2d"}
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.394631 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q2qpt" event={"ID":"c83842ec-9933-4f84-bb4a-c84ca61a28e1","Type":"ContainerStarted","Data":"c3e3e9b2ed47e2f7480af78d679ab1d816ea01c193c35244aa52793e0f02f112"}
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.400760 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q" event={"ID":"a4a3cd73-aa6c-4128-8a5f-561719e9b170","Type":"ContainerStarted","Data":"c5767e3fcdbc5dc97c0a042c7acee915b4df773e7712c84a1e6a7c143810b3b2"}
Mar 13 11:51:33 crc kubenswrapper[4837]: W0313 11:51:33.420278 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41e982da_ccd1_4b0c_9f0e_c220e06052a0.slice/crio-de0a9c7b74e03c5422dfa011bacd29880935071aff96993b9c3cf9d85694e4d2 WatchSource:0}: Error finding container de0a9c7b74e03c5422dfa011bacd29880935071aff96993b9c3cf9d85694e4d2: Status 404 returned error can't find the container with id de0a9c7b74e03c5422dfa011bacd29880935071aff96993b9c3cf9d85694e4d2
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.420742 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 11:51:33 crc kubenswrapper[4837]: E0313 11:51:33.421854 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:33.921821741 +0000 UTC m=+209.560088504 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.427387 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:33 crc kubenswrapper[4837]: E0313 11:51:33.431301 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:33.931278407 +0000 UTC m=+209.569545170 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.492292 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-9tkxg"
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.497560 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 11:51:33 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld
Mar 13 11:51:33 crc kubenswrapper[4837]: [+]process-running ok
Mar 13 11:51:33 crc kubenswrapper[4837]: healthz check failed
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.497624 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.509967 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7"]
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.512054 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c"]
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.531183 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 11:51:33 crc kubenswrapper[4837]: E0313 11:51:33.532060 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:34.032044682 +0000 UTC m=+209.670311445 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.533841 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" podStartSLOduration=151.533824577 podStartE2EDuration="2m31.533824577s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:33.528049266 +0000 UTC m=+209.166316029" watchObservedRunningTime="2026-03-13 11:51:33.533824577 +0000 UTC m=+209.172091340"
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.563179 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" podStartSLOduration=152.563158065 podStartE2EDuration="2m32.563158065s" podCreationTimestamp="2026-03-13 11:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:33.54830319 +0000 UTC m=+209.186569953" watchObservedRunningTime="2026-03-13 11:51:33.563158065 +0000 UTC m=+209.201424838"
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.599581 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xfcxm"]
Mar 13 11:51:33 crc kubenswrapper[4837]: W0313 11:51:33.607788 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode366a2cd_5dfa_45c9_b187_92772da0b827.slice/crio-92b7274c7455968a613e949633813e0ece8512f287e59aea006ff7883bef05ab WatchSource:0}: Error finding container 92b7274c7455968a613e949633813e0ece8512f287e59aea006ff7883bef05ab: Status 404 returned error can't find the container with id 92b7274c7455968a613e949633813e0ece8512f287e59aea006ff7883bef05ab
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.608409 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt"]
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.624169 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr"]
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.634392 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:33 crc kubenswrapper[4837]: E0313 11:51:33.634795 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:34.134784768 +0000 UTC m=+209.773051531 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.635380 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp" podStartSLOduration=151.635370786 podStartE2EDuration="2m31.635370786s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:33.633838778 +0000 UTC m=+209.272105541" watchObservedRunningTime="2026-03-13 11:51:33.635370786 +0000 UTC m=+209.273637549"
Mar 13 11:51:33 crc kubenswrapper[4837]: W0313 11:51:33.688414 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2960b8ba_5517_4915_b524_1f3f6d0f043c.slice/crio-1b7479fdaddb40cc2b2b5d9856e0bfd15305f77890603dcca573328b382a1ea9 WatchSource:0}: Error finding container 1b7479fdaddb40cc2b2b5d9856e0bfd15305f77890603dcca573328b382a1ea9: Status 404 returned error can't find the container with id 1b7479fdaddb40cc2b2b5d9856e0bfd15305f77890603dcca573328b382a1ea9
Mar 13 11:51:33 crc kubenswrapper[4837]: W0313 11:51:33.691175 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod831db5b2_5229_4b52_8783_f99c640ba856.slice/crio-3c294e8fd793300ee1caaf9d2068d0b14e0a9dc058385ad90bb33a7237e0e283 WatchSource:0}: Error finding container 3c294e8fd793300ee1caaf9d2068d0b14e0a9dc058385ad90bb33a7237e0e283: Status 404 returned error can't find the container with id 3c294e8fd793300ee1caaf9d2068d0b14e0a9dc058385ad90bb33a7237e0e283
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.701747 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8vgmn"]
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.735344 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 11:51:33 crc kubenswrapper[4837]: E0313 11:51:33.735554 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:34.235519391 +0000 UTC m=+209.873786164 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.735776 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:33 crc kubenswrapper[4837]: E0313 11:51:33.736169 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:34.236153831 +0000 UTC m=+209.874420594 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.753977 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" podStartSLOduration=152.753951888 podStartE2EDuration="2m32.753951888s" podCreationTimestamp="2026-03-13 11:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:33.723770043 +0000 UTC m=+209.362036826" watchObservedRunningTime="2026-03-13 11:51:33.753951888 +0000 UTC m=+209.392218651"
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.758882 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" podStartSLOduration=151.758861752 podStartE2EDuration="2m31.758861752s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:33.758551823 +0000 UTC m=+209.396818606" watchObservedRunningTime="2026-03-13 11:51:33.758861752 +0000 UTC m=+209.397128515"
Mar 13 11:51:33 crc kubenswrapper[4837]: W0313 11:51:33.791710 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fb85cad_ec2d_4ada_bd68_55937d96a779.slice/crio-32fbad917c53f080dae29a17b7d2e0db3f0b48efe2df248f03fa8431da965ad3 WatchSource:0}: Error finding container 32fbad917c53f080dae29a17b7d2e0db3f0b48efe2df248f03fa8431da965ad3: Status 404 returned error can't find the container with id 32fbad917c53f080dae29a17b7d2e0db3f0b48efe2df248f03fa8431da965ad3
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.794940 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556710-lcprh"]
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.800306 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-8dj7w" podStartSLOduration=152.800288269 podStartE2EDuration="2m32.800288269s" podCreationTimestamp="2026-03-13 11:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:33.797460691 +0000 UTC m=+209.435727454" watchObservedRunningTime="2026-03-13 11:51:33.800288269 +0000 UTC m=+209.438555052"
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.812407 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9hkj4"]
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.820326 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-84xjl"]
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.824437 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-z9thp"]
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.838724 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 11:51:33 crc kubenswrapper[4837]: E0313 11:51:33.839135 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:34.339111124 +0000 UTC m=+209.977377887 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.840451 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:33 crc kubenswrapper[4837]: E0313 11:51:33.841010 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:34.340997873 +0000 UTC m=+209.979264636 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:33 crc kubenswrapper[4837]: W0313 11:51:33.869690 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6085cb91_fec3_45bd_bfdc_a10e6043049f.slice/crio-aa9b8655d804dd95c7ef91fad04e5b946fd2097131e680a82470cf7f2089113c WatchSource:0}: Error finding container aa9b8655d804dd95c7ef91fad04e5b946fd2097131e680a82470cf7f2089113c: Status 404 returned error can't find the container with id aa9b8655d804dd95c7ef91fad04e5b946fd2097131e680a82470cf7f2089113c
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.874496 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.877817 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-9tkxg" podStartSLOduration=151.877802656 podStartE2EDuration="2m31.877802656s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:33.877605189 +0000 UTC m=+209.515871972" watchObservedRunningTime="2026-03-13 11:51:33.877802656 +0000 UTC m=+209.516069419"
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.941938 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 11:51:33 crc kubenswrapper[4837]: E0313 11:51:33.942199 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:34.44216493 +0000 UTC m=+210.080431693 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.955970 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-q2qpt" podStartSLOduration=151.955945972 podStartE2EDuration="2m31.955945972s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:33.95044979 +0000 UTC m=+209.588716563" watchObservedRunningTime="2026-03-13 11:51:33.955945972 +0000 UTC m=+209.594212735"
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.047186 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\")
" pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:34 crc kubenswrapper[4837]: E0313 11:51:34.049687 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:34.549672937 +0000 UTC m=+210.187939690 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.051520 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" podStartSLOduration=152.051499453 podStartE2EDuration="2m32.051499453s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.045843626 +0000 UTC m=+209.684110389" watchObservedRunningTime="2026-03-13 11:51:34.051499453 +0000 UTC m=+209.689766216" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.121334 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr" podStartSLOduration=153.121315569 podStartE2EDuration="2m33.121315569s" podCreationTimestamp="2026-03-13 11:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.093817568 +0000 UTC 
m=+209.732084331" watchObservedRunningTime="2026-03-13 11:51:34.121315569 +0000 UTC m=+209.759582332" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.122799 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f" podStartSLOduration=152.122786575 podStartE2EDuration="2m32.122786575s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.119401469 +0000 UTC m=+209.757668232" watchObservedRunningTime="2026-03-13 11:51:34.122786575 +0000 UTC m=+209.761053338" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.154351 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:34 crc kubenswrapper[4837]: E0313 11:51:34.154817 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:34.654795767 +0000 UTC m=+210.293062530 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.256222 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:34 crc kubenswrapper[4837]: E0313 11:51:34.256780 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:34.75676468 +0000 UTC m=+210.395031453 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.357088 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:34 crc kubenswrapper[4837]: E0313 11:51:34.357453 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:34.857439502 +0000 UTC m=+210.495706265 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.415332 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q" event={"ID":"a4a3cd73-aa6c-4128-8a5f-561719e9b170","Type":"ContainerStarted","Data":"336c92baa53a146263254fd491124e611101140747c03155c6b5a6a9c70b55c2"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.422748 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" event={"ID":"998432c5-238a-466a-a779-7d5126210706","Type":"ContainerStarted","Data":"7a69b7c6b089175a7cc7fc1cba2ac9b38104e05673fb7ba5b0c410bd95c86b31"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.441441 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-64xpb" event={"ID":"fe42bd29-b8a7-4a9f-89e2-ab3b944d7c26","Type":"ContainerStarted","Data":"04c8c046f4062a12915ce1bc97ea7ac2245aedb416cfc34c07574367a17e5c75"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.442681 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q" podStartSLOduration=152.442662929 podStartE2EDuration="2m32.442662929s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 
11:51:34.441160612 +0000 UTC m=+210.079427385" watchObservedRunningTime="2026-03-13 11:51:34.442662929 +0000 UTC m=+210.080929692" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.460244 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:34 crc kubenswrapper[4837]: E0313 11:51:34.460851 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:34.960815008 +0000 UTC m=+210.599081771 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.468868 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-64xpb" podStartSLOduration=152.468848939 podStartE2EDuration="2m32.468848939s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.467697583 +0000 UTC m=+210.105964346" watchObservedRunningTime="2026-03-13 11:51:34.468848939 +0000 UTC m=+210.107115712" 
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.475473 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-84xjl" event={"ID":"8bc71239-c925-4911-bfa5-e7a564dcd654","Type":"ContainerStarted","Data":"93a423c2f3f7c65543caac4a0f22a811ce2eabd4f3aefb38b12ee8b804d63fae"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.482821 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc" event={"ID":"f5681b96-47c5-44f8-9e5d-671678930750","Type":"ContainerStarted","Data":"53d534af6f8df34ececfb1a8a0e6c5ad8f272dcb9c009a461b5b14acbfd167b8"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.482874 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc" event={"ID":"f5681b96-47c5-44f8-9e5d-671678930750","Type":"ContainerStarted","Data":"6f263bafe32aba6747375cfb633f62ecda193eb68ec17190dd9cca7b969ae95a"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.483905 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.491058 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" event={"ID":"f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0","Type":"ContainerStarted","Data":"9d9e796c53101ac91bc5573107aa81ef153174f5c3a3c3ef62f853513d212d80"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.500116 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.503977 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:34 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:34 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:34 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.504037 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.504379 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" event={"ID":"831db5b2-5229-4b52-8783-f99c640ba856","Type":"ContainerStarted","Data":"965aad43c7ccd189d4d18246f935c745fc24b5e2cfb5b07896f9492e9109fb55"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.504419 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" event={"ID":"831db5b2-5229-4b52-8783-f99c640ba856","Type":"ContainerStarted","Data":"3c294e8fd793300ee1caaf9d2068d0b14e0a9dc058385ad90bb33a7237e0e283"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.510909 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" event={"ID":"8fb85cad-ec2d-4ada-bd68-55937d96a779","Type":"ContainerStarted","Data":"32fbad917c53f080dae29a17b7d2e0db3f0b48efe2df248f03fa8431da965ad3"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.512831 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc" podStartSLOduration=152.512809675 podStartE2EDuration="2m32.512809675s" podCreationTimestamp="2026-03-13 11:49:02 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.51134662 +0000 UTC m=+210.149613403" watchObservedRunningTime="2026-03-13 11:51:34.512809675 +0000 UTC m=+210.151076438" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.528729 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" event={"ID":"d0005e35-a11c-4773-a0d1-94fa4aff8a14","Type":"ContainerStarted","Data":"81c8f78518697dd119d66780b63d9f72316e68c2f8e4ec6dd363b91e11730a57"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.528775 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" event={"ID":"d0005e35-a11c-4773-a0d1-94fa4aff8a14","Type":"ContainerStarted","Data":"1600b61f7d9a501c5cb5dec3d449565dc9734994d5148c31b0f65a016a53ce24"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.528965 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.531003 4837 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-xhx6c container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body= Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.531088 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" podUID="d0005e35-a11c-4773-a0d1-94fa4aff8a14" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.537423 4837 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" podStartSLOduration=153.53726176 podStartE2EDuration="2m33.53726176s" podCreationTimestamp="2026-03-13 11:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.531217402 +0000 UTC m=+210.169484185" watchObservedRunningTime="2026-03-13 11:51:34.53726176 +0000 UTC m=+210.175528513" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.557135 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" podStartSLOduration=153.557096251 podStartE2EDuration="2m33.557096251s" podCreationTimestamp="2026-03-13 11:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.555984587 +0000 UTC m=+210.194251350" watchObservedRunningTime="2026-03-13 11:51:34.557096251 +0000 UTC m=+210.195363004" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.559588 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" event={"ID":"10ac507b-7307-4e09-ab72-b956d0139396","Type":"ContainerStarted","Data":"e62899fe641e7552245bdb1105a06db5016cb36ce949fad6221d2c72a4fdec51"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.560796 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:34 crc kubenswrapper[4837]: E0313 11:51:34.562839 4837 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:35.06281122 +0000 UTC m=+210.701078153 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.563825 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" event={"ID":"e366a2cd-5dfa-45c9-b187-92772da0b827","Type":"ContainerStarted","Data":"2bf3a279693bfd2239b0811b510f18eec4acf69529f2534645ac054c12ce5663"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.563900 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" event={"ID":"e366a2cd-5dfa-45c9-b187-92772da0b827","Type":"ContainerStarted","Data":"92b7274c7455968a613e949633813e0ece8512f287e59aea006ff7883bef05ab"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.565137 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.571506 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" event={"ID":"6db10103-96be-4420-b302-a7064e347f61","Type":"ContainerStarted","Data":"ad11424fde61443cfda2afa459aaadcdeb2d287845e06731b5a12889a60c35c7"} Mar 13 11:51:34 crc kubenswrapper[4837]: 
I0313 11:51:34.573745 4837 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-659h7 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.573824 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" podUID="e366a2cd-5dfa-45c9-b187-92772da0b827" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.577531 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" podStartSLOduration=152.577507441 podStartE2EDuration="2m32.577507441s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.576160009 +0000 UTC m=+210.214426772" watchObservedRunningTime="2026-03-13 11:51:34.577507441 +0000 UTC m=+210.215774204" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.593162 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx" event={"ID":"fca26784-7fdf-4923-bd07-35d182c2ad14","Type":"ContainerStarted","Data":"5607a7fb040fc09d5c18959f6f3f38a09ca7b1955d2c087bf5a6429e6ba86758"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.620975 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" 
event={"ID":"80d5bedc-a598-4779-be24-2d512ea7d148","Type":"ContainerStarted","Data":"67246c316b4f89ce453ed8c939d7b6959e5673d7d07e1b3189a0364926d8c36c"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.621034 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" event={"ID":"80d5bedc-a598-4779-be24-2d512ea7d148","Type":"ContainerStarted","Data":"5cf6fb6c12775c391cb2badc48b232d0fec1d099936960e591a7481d2edd85ca"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.623573 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556710-lcprh" event={"ID":"0484d991-f239-47a2-80ff-0237945c27ac","Type":"ContainerStarted","Data":"960f7af1fa61c8ed012820a8878b593f9924c583dd0d3076ea82e4ba9452a14b"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.630260 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" podStartSLOduration=152.630239271 podStartE2EDuration="2m32.630239271s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.624678107 +0000 UTC m=+210.262944890" watchObservedRunningTime="2026-03-13 11:51:34.630239271 +0000 UTC m=+210.268506034" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.630746 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" podStartSLOduration=153.630740237 podStartE2EDuration="2m33.630740237s" podCreationTimestamp="2026-03-13 11:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.603856136 +0000 UTC m=+210.242122899" watchObservedRunningTime="2026-03-13 11:51:34.630740237 +0000 UTC 
m=+210.269006990" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.641364 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" event={"ID":"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57","Type":"ContainerStarted","Data":"164f9219a9b284091a35c7220447c2c7b0bed7f5727c7a47455b22924bfd7017"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.641416 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" event={"ID":"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57","Type":"ContainerStarted","Data":"26ab5d52a299f8f16150b097ca72ca2f2e9edeabd8750e2fd4e382c8b4e868a8"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.645057 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5" event={"ID":"44f59229-dec6-4d9b-a63b-bd562b4523cf","Type":"ContainerStarted","Data":"8078318f2e84a807e81a2aab4ef9d643f99f9cbb8a870017c43050909154a8c8"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.645108 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5" event={"ID":"44f59229-dec6-4d9b-a63b-bd562b4523cf","Type":"ContainerStarted","Data":"7b919fbe145de00ddd31c2e9c36d139a2ba1f597228ad3aacec8b12b44c0bd55"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.650489 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" event={"ID":"10be2947-2e91-4a8e-b54e-69cdab598955","Type":"ContainerStarted","Data":"844116c251951dc948531c150544bded3b51d4b0a39ec1479d3741e54fcffa22"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.656026 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8zzqp" 
event={"ID":"416fd214-ef6d-45b4-bf11-a35c92909523","Type":"ContainerStarted","Data":"5eb9cb8c84060b639aeb979d7a717fec41c33810120564ed7c88b7d4c8a36b76"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.656075 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8zzqp" event={"ID":"416fd214-ef6d-45b4-bf11-a35c92909523","Type":"ContainerStarted","Data":"e755c6302f2eb2419c8060998ee9930c1d704371a17e3337b1dbc4d6d0f0fbf2"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.656211 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" podStartSLOduration=152.656192674 podStartE2EDuration="2m32.656192674s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.654504482 +0000 UTC m=+210.292771245" watchObservedRunningTime="2026-03-13 11:51:34.656192674 +0000 UTC m=+210.294459437" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.663801 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:34 crc kubenswrapper[4837]: E0313 11:51:34.664779 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:35.164767553 +0000 UTC m=+210.803034316 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.665035 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wcfj4" event={"ID":"e6a94afd-1f9a-4281-9d94-2fac3916f2c3","Type":"ContainerStarted","Data":"499553b71cc7f3fa078528578afd4bc0d7fd23462f79f4bcc3fbaa571428442f"}
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.665088 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wcfj4" event={"ID":"e6a94afd-1f9a-4281-9d94-2fac3916f2c3","Type":"ContainerStarted","Data":"19a484d698e8d3794e1c4a936f5217201dc3ab68057937694742bd30ea81f214"}
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.699548 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8ktsx" event={"ID":"85ac6950-8b98-4d0c-8a2b-7eeeac8d1435","Type":"ContainerStarted","Data":"aea88127f304b60e92e6e3bfd6b308c34d21f2a1163ada78a267b7d02277f97d"}
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.699615 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8ktsx" event={"ID":"85ac6950-8b98-4d0c-8a2b-7eeeac8d1435","Type":"ContainerStarted","Data":"e010eb156dd5e54ff5b67933f30569e7dfedf9e8ebf856adbf791bedb5ca7007"}
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.700745 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-8ktsx"
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.711091 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9g2bm" event={"ID":"b42d2c64-cd10-4923-aed0-dc586696da9a","Type":"ContainerStarted","Data":"13687571c1d08b467659e2286548ed0f122cf7da33338d2df258402f1288580d"}
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.717613 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx" podStartSLOduration=153.717590756 podStartE2EDuration="2m33.717590756s" podCreationTimestamp="2026-03-13 11:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.715043867 +0000 UTC m=+210.353310630" watchObservedRunningTime="2026-03-13 11:51:34.717590756 +0000 UTC m=+210.355857519"
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.723991 4837 patch_prober.go:28] interesting pod/downloads-7954f5f757-8ktsx container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.724057 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8ktsx" podUID="85ac6950-8b98-4d0c-8a2b-7eeeac8d1435" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.741857 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" event={"ID":"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd","Type":"ContainerStarted","Data":"9051790ee9843afb035fb7211a009d5253125f571a3e0be3940d0df74643b8c8"}
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.766453 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 11:51:34 crc kubenswrapper[4837]: E0313 11:51:34.767995 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:35.267967233 +0000 UTC m=+210.906234006 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.782137 4837 generic.go:334] "Generic (PLEG): container finished" podID="ffb5553f-d2d5-4584-9bf8-7212a378f358" containerID="f370af6d9bb9be9e1462cd46c0419f4c6f0c3a54cbe69a7101da09b322008a64" exitCode=0
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.782258 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" event={"ID":"ffb5553f-d2d5-4584-9bf8-7212a378f358","Type":"ContainerDied","Data":"f370af6d9bb9be9e1462cd46c0419f4c6f0c3a54cbe69a7101da09b322008a64"}
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.792783 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" event={"ID":"41e982da-ccd1-4b0c-9f0e-c220e06052a0","Type":"ContainerStarted","Data":"70676082def244677edb5580775204c77414b6bc522caa2f0805cd1405f61587"}
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.792837 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" event={"ID":"41e982da-ccd1-4b0c-9f0e-c220e06052a0","Type":"ContainerStarted","Data":"de0a9c7b74e03c5422dfa011bacd29880935071aff96993b9c3cf9d85694e4d2"}
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.794147 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck"
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.802540 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9hkj4" event={"ID":"90308f63-bacc-491b-9ce2-ffbb2eaaea1f","Type":"ContainerStarted","Data":"ec5b1f024551089ef0a7ec39c81b1af86808e9f3a3709026bb866788e9c3043e"}
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.816831 4837 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-4gsck container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body=
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.816910 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" podUID="41e982da-ccd1-4b0c-9f0e-c220e06052a0" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused"
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.817900 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" podStartSLOduration=152.817885376 podStartE2EDuration="2m32.817885376s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.81705036 +0000 UTC m=+210.455317123" watchObservedRunningTime="2026-03-13 11:51:34.817885376 +0000 UTC m=+210.456152139"
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.818882 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z9thp" event={"ID":"6085cb91-fec3-45bd-bfdc-a10e6043049f","Type":"ContainerStarted","Data":"aa9b8655d804dd95c7ef91fad04e5b946fd2097131e680a82470cf7f2089113c"}
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.819309 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-9g2bm" podStartSLOduration=5.819298441 podStartE2EDuration="5.819298441s" podCreationTimestamp="2026-03-13 11:51:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.753709227 +0000 UTC m=+210.391976020" watchObservedRunningTime="2026-03-13 11:51:34.819298441 +0000 UTC m=+210.457565224"
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.830774 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht" event={"ID":"9025cb05-7c57-488b-a8cb-441552547aae","Type":"ContainerStarted","Data":"6221e47c7d92fba80529d8c67748e95db883e3bfd59b0c1395e15b0b1c63df79"}
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.862815 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" podStartSLOduration=152.862790422 podStartE2EDuration="2m32.862790422s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.860047376 +0000 UTC m=+210.498314139" watchObservedRunningTime="2026-03-13 11:51:34.862790422 +0000 UTC m=+210.501057185"
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.863155 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl" event={"ID":"ec4b9459-d392-4fc5-9b6f-a87ca50e85b1","Type":"ContainerStarted","Data":"afa74a59bf362e1526eaf8019370b76f31cc88294794a75fa2e16f11441e11d9"}
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.868532 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:34 crc kubenswrapper[4837]: E0313 11:51:34.887699 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:35.387680911 +0000 UTC m=+211.025947664 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.909615 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" event={"ID":"2960b8ba-5517-4915-b524-1f3f6d0f043c","Type":"ContainerStarted","Data":"1b7479fdaddb40cc2b2b5d9856e0bfd15305f77890603dcca573328b382a1ea9"}
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.940754 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t" event={"ID":"00848ba6-522a-45c7-81bd-7ab287d77626","Type":"ContainerStarted","Data":"c171f1f4f12cca2eb0fc64dbb462cae8fdfab2815e450b48a43b5af2e0b3f556"}
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.947598 4837 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-9dbhc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.947675 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" podUID="f8bc408a-bca6-42ff-8572-2ba9a3978682" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused"
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.949444 4837 patch_prober.go:28] interesting pod/console-operator-58897d9998-8dj7w container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.949524 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-8dj7w" podUID="003e8201-4e67-4356-b0c1-8cc135451069" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused"
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.949598 4837 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-8q6j6 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body=
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.949802 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" podUID="27d45de2-e0ab-4c3e-b3da-b20e60e26801" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused"
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.956984 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs"
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.960985 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-8ktsx" podStartSLOduration=152.960966845 podStartE2EDuration="2m32.960966845s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.927503028 +0000 UTC m=+210.565769801" watchObservedRunningTime="2026-03-13 11:51:34.960966845 +0000 UTC m=+210.599233608"
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.964846 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-wcfj4" podStartSLOduration=152.964821206 podStartE2EDuration="2m32.964821206s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.960345266 +0000 UTC m=+210.598612029" watchObservedRunningTime="2026-03-13 11:51:34.964821206 +0000 UTC m=+210.603087969"
Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.975214 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 11:51:34 crc kubenswrapper[4837]: E0313 11:51:34.976553 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:35.476532673 +0000 UTC m=+211.114799496 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:34.999066 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5" podStartSLOduration=152.999043588 podStartE2EDuration="2m32.999043588s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.996595871 +0000 UTC m=+210.634862644" watchObservedRunningTime="2026-03-13 11:51:34.999043588 +0000 UTC m=+210.637310341"
Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.073988 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl" podStartSLOduration=153.073968753 podStartE2EDuration="2m33.073968753s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:35.072808657 +0000 UTC m=+210.711075430" watchObservedRunningTime="2026-03-13 11:51:35.073968753 +0000 UTC m=+210.712235516"
Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.078046 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:35 crc kubenswrapper[4837]: E0313 11:51:35.078405 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:35.578390731 +0000 UTC m=+211.216657504 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.100767 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t" podStartSLOduration=153.100751111 podStartE2EDuration="2m33.100751111s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:35.100247206 +0000 UTC m=+210.738513969" watchObservedRunningTime="2026-03-13 11:51:35.100751111 +0000 UTC m=+210.739017875"
Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.129298 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" podStartSLOduration=153.129280065 podStartE2EDuration="2m33.129280065s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:35.128250942 +0000 UTC m=+210.766517715" watchObservedRunningTime="2026-03-13 11:51:35.129280065 +0000 UTC m=+210.767546828"
Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.159787 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" podStartSLOduration=153.159769499 podStartE2EDuration="2m33.159769499s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:35.156012111 +0000 UTC m=+210.794278884" watchObservedRunningTime="2026-03-13 11:51:35.159769499 +0000 UTC m=+210.798036272"
Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.180735 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 11:51:35 crc kubenswrapper[4837]: E0313 11:51:35.180853 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:35.680832979 +0000 UTC m=+211.319099752 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.181277 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:35 crc kubenswrapper[4837]: E0313 11:51:35.181711 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:35.681701726 +0000 UTC m=+211.319968489 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.210413 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9hkj4" podStartSLOduration=6.210394764 podStartE2EDuration="6.210394764s" podCreationTimestamp="2026-03-13 11:51:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:35.17579164 +0000 UTC m=+210.814058403" watchObservedRunningTime="2026-03-13 11:51:35.210394764 +0000 UTC m=+210.848661527"
Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.252673 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht" podStartSLOduration=153.252656097 podStartE2EDuration="2m33.252656097s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:35.221323866 +0000 UTC m=+210.859590639" watchObservedRunningTime="2026-03-13 11:51:35.252656097 +0000 UTC m=+210.890922860"
Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.287295 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 11:51:35 crc kubenswrapper[4837]: E0313 11:51:35.287657 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:35.787628992 +0000 UTC m=+211.425895755 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.289567 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" podStartSLOduration=153.289542462 podStartE2EDuration="2m33.289542462s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:35.253976788 +0000 UTC m=+210.892243571" watchObservedRunningTime="2026-03-13 11:51:35.289542462 +0000 UTC m=+210.927809225"
Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.392330 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:35 crc kubenswrapper[4837]: E0313 11:51:35.392793 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:35.892778474 +0000 UTC m=+211.531045237 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.488145 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.488434 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.494424 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 11:51:35 crc kubenswrapper[4837]: E0313 11:51:35.505541 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:36.005522264 +0000 UTC m=+211.643789027 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.510829 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 11:51:35 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld
Mar 13 11:51:35 crc kubenswrapper[4837]: [+]process-running ok
Mar 13 11:51:35 crc kubenswrapper[4837]: healthz check failed
Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.510912 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.606871 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.607257 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:35 crc kubenswrapper[4837]: E0313 11:51:35.607552 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:36.107538967 +0000 UTC m=+211.745805730 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.711475 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 11:51:35 crc kubenswrapper[4837]: E0313 11:51:35.712041 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:36.212021358 +0000 UTC m=+211.850288121 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.712318 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:35 crc kubenswrapper[4837]: E0313 11:51:35.712563 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:36.212557255 +0000 UTC m=+211.850824008 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.815233 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 11:51:35 crc kubenswrapper[4837]: E0313 11:51:35.815538 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:36.315520199 +0000 UTC m=+211.953786972 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.815694 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:35 crc kubenswrapper[4837]: E0313 11:51:35.815955 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:36.315946642 +0000 UTC m=+211.954213485 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.916892 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 11:51:35 crc kubenswrapper[4837]: E0313 11:51:35.917378 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:36.417362486 +0000 UTC m=+212.055629249 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.995847 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" event={"ID":"998432c5-238a-466a-a779-7d5126210706","Type":"ContainerStarted","Data":"17bcf3017106794b154977fad9d2c812796ccc0d0ef29808bf462a133588ee4f"} Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.024694 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:36 crc kubenswrapper[4837]: E0313 11:51:36.025164 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:36.525145291 +0000 UTC m=+212.163412144 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.045764 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-84xjl" event={"ID":"8bc71239-c925-4911-bfa5-e7a564dcd654","Type":"ContainerStarted","Data":"055d1ece3e453d5cd8ff1c8063ecb17ee0490ae16e5ed5fdb15f70404bb4569d"} Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.064522 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9hkj4" event={"ID":"90308f63-bacc-491b-9ce2-ffbb2eaaea1f","Type":"ContainerStarted","Data":"4dc188b2c41b6fea37bc31958d827dc56918d3204bb89c56485df4a2a5ee352a"} Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.091243 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" event={"ID":"2960b8ba-5517-4915-b524-1f3f6d0f043c","Type":"ContainerStarted","Data":"43fc2231f05867d05d847de1bd4090131ff52020558ef46e3913e63609a95979"} Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.100006 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8zzqp" event={"ID":"416fd214-ef6d-45b4-bf11-a35c92909523","Type":"ContainerStarted","Data":"251642b3e74397ace0a9f24d9c53652b9f098294fc8b460805946e095b03ca59"} Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.112383 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" podStartSLOduration=154.112366372 
podStartE2EDuration="2m34.112366372s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:36.112168716 +0000 UTC m=+211.750435479" watchObservedRunningTime="2026-03-13 11:51:36.112366372 +0000 UTC m=+211.750633135" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.117678 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" event={"ID":"8fb85cad-ec2d-4ada-bd68-55937d96a779","Type":"ContainerStarted","Data":"7062c61986b41d101ebecc3d1bfaa5e447d278c907a23b8b3db80e27716fe090"} Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.117904 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.119215 4837 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8vgmn container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.119259 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" podUID="8fb85cad-ec2d-4ada-bd68-55937d96a779" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.126123 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:36 crc kubenswrapper[4837]: E0313 11:51:36.127010 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:36.626974779 +0000 UTC m=+212.265241542 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.127725 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" event={"ID":"3e1f747d-78f3-4cbc-b313-eed531936c02","Type":"ContainerStarted","Data":"b2855d588ed711a2a0163b9bf580169f1f7ec427da32756423d44ca38e6cb5be"} Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.127771 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" event={"ID":"3e1f747d-78f3-4cbc-b313-eed531936c02","Type":"ContainerStarted","Data":"cc407b0d1ace08a36e1a88fb7b0359d00f5ca27055c133bc6c09e4fad8dddebf"} Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.138803 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z9thp" event={"ID":"6085cb91-fec3-45bd-bfdc-a10e6043049f","Type":"ContainerStarted","Data":"57e34defc005454facec5cabd8aafb289d26d833eaed6b9d8732c0557b13c1f5"} Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.138850 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z9thp" 
event={"ID":"6085cb91-fec3-45bd-bfdc-a10e6043049f","Type":"ContainerStarted","Data":"87a261aea7c43882c29f16625a2b4dd46997bbc086bc97384be02d61ee5a8595"} Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.139409 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-z9thp" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.142013 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" event={"ID":"ffb5553f-d2d5-4584-9bf8-7212a378f358","Type":"ContainerStarted","Data":"4339d1e1fecf820037c6a5425b00f03c18da3f1bea62317e129879816d68eb07"} Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.149323 4837 patch_prober.go:28] interesting pod/downloads-7954f5f757-8ktsx container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.149376 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8ktsx" podUID="85ac6950-8b98-4d0c-8a2b-7eeeac8d1435" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.149460 4837 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-659h7 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.149481 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" podUID="e366a2cd-5dfa-45c9-b187-92772da0b827" containerName="olm-operator" probeResult="failure" 
output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.149524 4837 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-xhx6c container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body= Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.149577 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" podUID="d0005e35-a11c-4773-a0d1-94fa4aff8a14" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.160364 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.229999 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:36 crc kubenswrapper[4837]: E0313 11:51:36.233558 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:36.733547596 +0000 UTC m=+212.371814359 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.255225 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-8zzqp" podStartSLOduration=154.255202053 podStartE2EDuration="2m34.255202053s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:36.208658156 +0000 UTC m=+211.846924929" watchObservedRunningTime="2026-03-13 11:51:36.255202053 +0000 UTC m=+211.893468836" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.255422 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" podStartSLOduration=155.25541633 podStartE2EDuration="2m35.25541633s" podCreationTimestamp="2026-03-13 11:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:36.249541026 +0000 UTC m=+211.887807809" watchObservedRunningTime="2026-03-13 11:51:36.25541633 +0000 UTC m=+211.893683083" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.277102 4837 ???:1] "http: TLS handshake error from 192.168.126.11:42380: no serving certificate available for the kubelet" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.337900 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:36 crc kubenswrapper[4837]: E0313 11:51:36.344744 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:36.838582344 +0000 UTC m=+212.476849107 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.361824 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-z9thp" podStartSLOduration=7.361808031 podStartE2EDuration="7.361808031s" podCreationTimestamp="2026-03-13 11:51:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:36.325194735 +0000 UTC m=+211.963461508" watchObservedRunningTime="2026-03-13 11:51:36.361808031 +0000 UTC m=+212.000074794" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.402707 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" podStartSLOduration=154.402689471 podStartE2EDuration="2m34.402689471s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:36.401110041 +0000 UTC m=+212.039376814" watchObservedRunningTime="2026-03-13 11:51:36.402689471 +0000 UTC m=+212.040956234" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.403369 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" podStartSLOduration=154.403363592 podStartE2EDuration="2m34.403363592s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:36.362132471 +0000 UTC m=+212.000399234" watchObservedRunningTime="2026-03-13 11:51:36.403363592 +0000 UTC m=+212.041630355" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.418910 4837 ???:1] "http: TLS handshake error from 192.168.126.11:42394: no serving certificate available for the kubelet" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.440136 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:36 crc kubenswrapper[4837]: E0313 11:51:36.440429 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:36.940416602 +0000 UTC m=+212.578683365 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.497396 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:36 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:36 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:36 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.497462 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.541957 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:36 crc kubenswrapper[4837]: E0313 11:51:36.542216 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 11:51:37.042171957 +0000 UTC m=+212.680438720 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.571690 4837 ???:1] "http: TLS handshake error from 192.168.126.11:42408: no serving certificate available for the kubelet" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.643815 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:36 crc kubenswrapper[4837]: E0313 11:51:36.644214 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.144191721 +0000 UTC m=+212.782458564 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.684820 4837 ???:1] "http: TLS handshake error from 192.168.126.11:42424: no serving certificate available for the kubelet" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.744870 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:36 crc kubenswrapper[4837]: E0313 11:51:36.745235 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.245207404 +0000 UTC m=+212.883474167 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.745325 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:36 crc kubenswrapper[4837]: E0313 11:51:36.745673 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.245660308 +0000 UTC m=+212.883927071 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.814802 4837 ???:1] "http: TLS handshake error from 192.168.126.11:42432: no serving certificate available for the kubelet" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.846575 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:36 crc kubenswrapper[4837]: E0313 11:51:36.846735 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.346707942 +0000 UTC m=+212.984974705 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.846879 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:36 crc kubenswrapper[4837]: E0313 11:51:36.847173 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.347166685 +0000 UTC m=+212.985433448 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.947838 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:36 crc kubenswrapper[4837]: E0313 11:51:36.948038 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.448007442 +0000 UTC m=+213.086274215 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.948080 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:36 crc kubenswrapper[4837]: E0313 11:51:36.948445 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.448433386 +0000 UTC m=+213.086700209 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.005358 4837 ???:1] "http: TLS handshake error from 192.168.126.11:42442: no serving certificate available for the kubelet"
Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.048754 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.048957 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.548924802 +0000 UTC m=+213.187191625 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.049108 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.049473 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.549462819 +0000 UTC m=+213.187729652 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.150910 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.151469 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.651451762 +0000 UTC m=+213.289718535 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.152173 4837 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8vgmn container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body=
Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.152237 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" podUID="8fb85cad-ec2d-4ada-bd68-55937d96a779" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused"
Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.152598 4837 patch_prober.go:28] interesting pod/downloads-7954f5f757-8ktsx container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.152623 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8ktsx" podUID="85ac6950-8b98-4d0c-8a2b-7eeeac8d1435" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.160173 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7"
Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.230835 4837 ???:1] "http: TLS handshake error from 192.168.126.11:42452: no serving certificate available for the kubelet"
Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.252948 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.254438 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.754420315 +0000 UTC m=+213.392687128 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.354267 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.354601 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.85456846 +0000 UTC m=+213.492835223 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.354899 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.355324 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.855314214 +0000 UTC m=+213.493580977 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.456452 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.456666 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.956625336 +0000 UTC m=+213.594892099 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.456862 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.457216 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.957209503 +0000 UTC m=+213.595476266 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.494995 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 11:51:37 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld
Mar 13 11:51:37 crc kubenswrapper[4837]: [+]process-running ok
Mar 13 11:51:37 crc kubenswrapper[4837]: healthz check failed
Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.495053 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.558432 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.558793 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:38.058776893 +0000 UTC m=+213.697043656 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.659732 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.660011 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:38.159999552 +0000 UTC m=+213.798266315 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.714890 4837 ???:1] "http: TLS handshake error from 192.168.126.11:42456: no serving certificate available for the kubelet"
Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.762704 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.762886 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:38.262855232 +0000 UTC m=+213.901122005 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.763184 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.763572 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:38.263559965 +0000 UTC m=+213.901826798 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.863798 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.864208 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:38.364189835 +0000 UTC m=+214.002456608 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.965401 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.965945 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:38.465900969 +0000 UTC m=+214.104167782 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.998526 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-twtbj"]
Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.999407 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-twtbj"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.004120 4837 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-f97pg container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.004170 4837 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-f97pg container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.004195 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" podUID="f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.004245 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" podUID="f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.005079 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.034421 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-twtbj"]
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.066858 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.067177 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/278c91cc-2624-42cd-a35e-287e22d22f7d-utilities\") pod \"community-operators-twtbj\" (UID: \"278c91cc-2624-42cd-a35e-287e22d22f7d\") " pod="openshift-marketplace/community-operators-twtbj"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.067203 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcgkl\" (UniqueName: \"kubernetes.io/projected/278c91cc-2624-42cd-a35e-287e22d22f7d-kube-api-access-bcgkl\") pod \"community-operators-twtbj\" (UID: \"278c91cc-2624-42cd-a35e-287e22d22f7d\") " pod="openshift-marketplace/community-operators-twtbj"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.067244 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/278c91cc-2624-42cd-a35e-287e22d22f7d-catalog-content\") pod \"community-operators-twtbj\" (UID: \"278c91cc-2624-42cd-a35e-287e22d22f7d\") " pod="openshift-marketplace/community-operators-twtbj"
Mar 13 11:51:38 crc kubenswrapper[4837]: E0313 11:51:38.067347 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:38.567332095 +0000 UTC m=+214.205598858 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.168858 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/278c91cc-2624-42cd-a35e-287e22d22f7d-utilities\") pod \"community-operators-twtbj\" (UID: \"278c91cc-2624-42cd-a35e-287e22d22f7d\") " pod="openshift-marketplace/community-operators-twtbj"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.168922 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcgkl\" (UniqueName: \"kubernetes.io/projected/278c91cc-2624-42cd-a35e-287e22d22f7d-kube-api-access-bcgkl\") pod \"community-operators-twtbj\" (UID: \"278c91cc-2624-42cd-a35e-287e22d22f7d\") " pod="openshift-marketplace/community-operators-twtbj"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.168988 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/278c91cc-2624-42cd-a35e-287e22d22f7d-catalog-content\") pod \"community-operators-twtbj\" (UID: \"278c91cc-2624-42cd-a35e-287e22d22f7d\") " pod="openshift-marketplace/community-operators-twtbj"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.169067 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:38 crc kubenswrapper[4837]: E0313 11:51:38.169704 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:38.669688089 +0000 UTC m=+214.307954862 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.169714 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/278c91cc-2624-42cd-a35e-287e22d22f7d-catalog-content\") pod \"community-operators-twtbj\" (UID: \"278c91cc-2624-42cd-a35e-287e22d22f7d\") " pod="openshift-marketplace/community-operators-twtbj"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.169789 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/278c91cc-2624-42cd-a35e-287e22d22f7d-utilities\") pod \"community-operators-twtbj\" (UID: \"278c91cc-2624-42cd-a35e-287e22d22f7d\") " pod="openshift-marketplace/community-operators-twtbj"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.175266 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-84xjl" event={"ID":"8bc71239-c925-4911-bfa5-e7a564dcd654","Type":"ContainerStarted","Data":"84079e4e10d6acaf8229cfb3ae643344c68afe85070cfbbb2e35088762c2fa76"}
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.181902 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ft6cr"]
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.192460 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ft6cr"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.203485 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.216925 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ft6cr"]
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.223406 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcgkl\" (UniqueName: \"kubernetes.io/projected/278c91cc-2624-42cd-a35e-287e22d22f7d-kube-api-access-bcgkl\") pod \"community-operators-twtbj\" (UID: \"278c91cc-2624-42cd-a35e-287e22d22f7d\") " pod="openshift-marketplace/community-operators-twtbj"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.245795 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.250319 4837 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.251939 4837 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-13T11:51:38.250342734Z","Handler":null,"Name":""}
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.272211 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.272544 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6060cf2-077e-4112-af57-f100e297f320-utilities\") pod \"certified-operators-ft6cr\" (UID: \"e6060cf2-077e-4112-af57-f100e297f320\") " pod="openshift-marketplace/certified-operators-ft6cr"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.272682 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6060cf2-077e-4112-af57-f100e297f320-catalog-content\") pod \"certified-operators-ft6cr\" (UID: \"e6060cf2-077e-4112-af57-f100e297f320\") " pod="openshift-marketplace/certified-operators-ft6cr"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.272788 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfvgs\" (UniqueName: \"kubernetes.io/projected/e6060cf2-077e-4112-af57-f100e297f320-kube-api-access-gfvgs\") pod \"certified-operators-ft6cr\" (UID: \"e6060cf2-077e-4112-af57-f100e297f320\") " pod="openshift-marketplace/certified-operators-ft6cr"
Mar 13 11:51:38 crc kubenswrapper[4837]: E0313 11:51:38.273644 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:38.773604782 +0000 UTC m=+214.411871545 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.308150 4837 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.308185 4837 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.320098 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-twtbj"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.373749 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6060cf2-077e-4112-af57-f100e297f320-utilities\") pod \"certified-operators-ft6cr\" (UID: \"e6060cf2-077e-4112-af57-f100e297f320\") " pod="openshift-marketplace/certified-operators-ft6cr"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.373798 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.373862 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6060cf2-077e-4112-af57-f100e297f320-catalog-content\") pod \"certified-operators-ft6cr\" (UID: \"e6060cf2-077e-4112-af57-f100e297f320\") " pod="openshift-marketplace/certified-operators-ft6cr"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.373966 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfvgs\" (UniqueName: \"kubernetes.io/projected/e6060cf2-077e-4112-af57-f100e297f320-kube-api-access-gfvgs\") pod \"certified-operators-ft6cr\" (UID: \"e6060cf2-077e-4112-af57-f100e297f320\") " pod="openshift-marketplace/certified-operators-ft6cr"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.374706 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6060cf2-077e-4112-af57-f100e297f320-catalog-content\") pod \"certified-operators-ft6cr\" (UID: \"e6060cf2-077e-4112-af57-f100e297f320\") " pod="openshift-marketplace/certified-operators-ft6cr"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.374721 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6060cf2-077e-4112-af57-f100e297f320-utilities\") pod \"certified-operators-ft6cr\" (UID: \"e6060cf2-077e-4112-af57-f100e297f320\") " pod="openshift-marketplace/certified-operators-ft6cr"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.378263 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vx4r8"]
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.379228 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vx4r8"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.403214 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfvgs\" (UniqueName: \"kubernetes.io/projected/e6060cf2-077e-4112-af57-f100e297f320-kube-api-access-gfvgs\") pod \"certified-operators-ft6cr\" (UID: \"e6060cf2-077e-4112-af57-f100e297f320\") " pod="openshift-marketplace/certified-operators-ft6cr"
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.403732 4837 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.403768 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.405415 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vx4r8"] Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.475312 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45e6ae52-59ef-446f-917a-549d34ffbf8e-catalog-content\") pod \"community-operators-vx4r8\" (UID: \"45e6ae52-59ef-446f-917a-549d34ffbf8e\") " pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.475764 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45e6ae52-59ef-446f-917a-549d34ffbf8e-utilities\") pod \"community-operators-vx4r8\" (UID: \"45e6ae52-59ef-446f-917a-549d34ffbf8e\") " pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.475830 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmfv2\" (UniqueName: \"kubernetes.io/projected/45e6ae52-59ef-446f-917a-549d34ffbf8e-kube-api-access-xmfv2\") pod \"community-operators-vx4r8\" (UID: \"45e6ae52-59ef-446f-917a-549d34ffbf8e\") " 
pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.507173 4837 ???:1] "http: TLS handshake error from 192.168.126.11:42468: no serving certificate available for the kubelet" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.517202 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:38 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:38 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:38 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.517253 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.544471 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ft6cr" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.579358 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45e6ae52-59ef-446f-917a-549d34ffbf8e-catalog-content\") pod \"community-operators-vx4r8\" (UID: \"45e6ae52-59ef-446f-917a-549d34ffbf8e\") " pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.579422 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45e6ae52-59ef-446f-917a-549d34ffbf8e-utilities\") pod \"community-operators-vx4r8\" (UID: \"45e6ae52-59ef-446f-917a-549d34ffbf8e\") " pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.579484 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmfv2\" (UniqueName: \"kubernetes.io/projected/45e6ae52-59ef-446f-917a-549d34ffbf8e-kube-api-access-xmfv2\") pod \"community-operators-vx4r8\" (UID: \"45e6ae52-59ef-446f-917a-549d34ffbf8e\") " pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.579826 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45e6ae52-59ef-446f-917a-549d34ffbf8e-catalog-content\") pod \"community-operators-vx4r8\" (UID: \"45e6ae52-59ef-446f-917a-549d34ffbf8e\") " pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.579959 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45e6ae52-59ef-446f-917a-549d34ffbf8e-utilities\") pod \"community-operators-vx4r8\" (UID: \"45e6ae52-59ef-446f-917a-549d34ffbf8e\") " 
pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.589723 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5tnrx"] Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.590678 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.603524 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5tnrx"] Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.624620 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmfv2\" (UniqueName: \"kubernetes.io/projected/45e6ae52-59ef-446f-917a-549d34ffbf8e-kube-api-access-xmfv2\") pod \"community-operators-vx4r8\" (UID: \"45e6ae52-59ef-446f-917a-549d34ffbf8e\") " pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.680308 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rpl9\" (UniqueName: \"kubernetes.io/projected/6870caea-07d6-4465-86b1-645a2e29b240-kube-api-access-4rpl9\") pod \"certified-operators-5tnrx\" (UID: \"6870caea-07d6-4465-86b1-645a2e29b240\") " pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.680412 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6870caea-07d6-4465-86b1-645a2e29b240-catalog-content\") pod \"certified-operators-5tnrx\" (UID: \"6870caea-07d6-4465-86b1-645a2e29b240\") " pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.680449 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6870caea-07d6-4465-86b1-645a2e29b240-utilities\") pod \"certified-operators-5tnrx\" (UID: \"6870caea-07d6-4465-86b1-645a2e29b240\") " pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.705003 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.705668 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.709373 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.709652 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.709829 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.716918 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.726138 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.781997 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.782314 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9762555c-fc85-46c5-99a4-0b01577780b0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9762555c-fc85-46c5-99a4-0b01577780b0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.782411 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6870caea-07d6-4465-86b1-645a2e29b240-catalog-content\") pod \"certified-operators-5tnrx\" (UID: \"6870caea-07d6-4465-86b1-645a2e29b240\") " pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.782461 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6870caea-07d6-4465-86b1-645a2e29b240-utilities\") pod 
\"certified-operators-5tnrx\" (UID: \"6870caea-07d6-4465-86b1-645a2e29b240\") " pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.782505 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9762555c-fc85-46c5-99a4-0b01577780b0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9762555c-fc85-46c5-99a4-0b01577780b0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.782535 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rpl9\" (UniqueName: \"kubernetes.io/projected/6870caea-07d6-4465-86b1-645a2e29b240-kube-api-access-4rpl9\") pod \"certified-operators-5tnrx\" (UID: \"6870caea-07d6-4465-86b1-645a2e29b240\") " pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.783236 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6870caea-07d6-4465-86b1-645a2e29b240-catalog-content\") pod \"certified-operators-5tnrx\" (UID: \"6870caea-07d6-4465-86b1-645a2e29b240\") " pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.783258 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6870caea-07d6-4465-86b1-645a2e29b240-utilities\") pod \"certified-operators-5tnrx\" (UID: \"6870caea-07d6-4465-86b1-645a2e29b240\") " pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.811703 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9dbhc"] Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.811904 4837 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" podUID="f8bc408a-bca6-42ff-8572-2ba9a3978682" containerName="controller-manager" containerID="cri-o://0737572e5f80685157a6578fd12aead5fdbe12b0fbb802f48732112a9a3e2ca5" gracePeriod=30 Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.812707 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rpl9\" (UniqueName: \"kubernetes.io/projected/6870caea-07d6-4465-86b1-645a2e29b240-kube-api-access-4rpl9\") pod \"certified-operators-5tnrx\" (UID: \"6870caea-07d6-4465-86b1-645a2e29b240\") " pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.883395 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9762555c-fc85-46c5-99a4-0b01577780b0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9762555c-fc85-46c5-99a4-0b01577780b0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.883538 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9762555c-fc85-46c5-99a4-0b01577780b0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9762555c-fc85-46c5-99a4-0b01577780b0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.883610 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9762555c-fc85-46c5-99a4-0b01577780b0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9762555c-fc85-46c5-99a4-0b01577780b0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.888862 4837 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs"] Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.889054 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" podUID="5a3cabe4-69ee-49f7-a783-e72ac1a56821" containerName="route-controller-manager" containerID="cri-o://0b1af16cc6188236788eb10501019d25c79e6c73c18075a85efbfcfdd6e8d90d" gracePeriod=30 Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.891521 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.910223 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.913015 4837 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-9dbhc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": read tcp 10.217.0.2:45600->10.217.0.7:8443: read: connection reset by peer" start-of-body= Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.913062 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" podUID="f8bc408a-bca6-42ff-8572-2ba9a3978682" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": read tcp 10.217.0.2:45600->10.217.0.7:8443: read: connection reset by peer" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.938155 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9762555c-fc85-46c5-99a4-0b01577780b0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9762555c-fc85-46c5-99a4-0b01577780b0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.976951 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.088727 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.092403 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.212523 4837 generic.go:334] "Generic (PLEG): container finished" podID="f8bc408a-bca6-42ff-8572-2ba9a3978682" containerID="0737572e5f80685157a6578fd12aead5fdbe12b0fbb802f48732112a9a3e2ca5" exitCode=0 Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.212630 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" event={"ID":"f8bc408a-bca6-42ff-8572-2ba9a3978682","Type":"ContainerDied","Data":"0737572e5f80685157a6578fd12aead5fdbe12b0fbb802f48732112a9a3e2ca5"} Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.217520 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-twtbj"] Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.226509 4837 generic.go:334] "Generic (PLEG): container finished" podID="5a3cabe4-69ee-49f7-a783-e72ac1a56821" containerID="0b1af16cc6188236788eb10501019d25c79e6c73c18075a85efbfcfdd6e8d90d" exitCode=0 Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.226586 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" event={"ID":"5a3cabe4-69ee-49f7-a783-e72ac1a56821","Type":"ContainerDied","Data":"0b1af16cc6188236788eb10501019d25c79e6c73c18075a85efbfcfdd6e8d90d"} Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.244262 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-84xjl" 
event={"ID":"8bc71239-c925-4911-bfa5-e7a564dcd654","Type":"ContainerStarted","Data":"87f1ed8a9c7321b308a794c5373b316a91bed14f1578617af0e948bfd338f284"} Mar 13 11:51:39 crc kubenswrapper[4837]: W0313 11:51:39.255325 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod278c91cc_2624_42cd_a35e_287e22d22f7d.slice/crio-c6c53bda7c5d3c5997c1cf5e6db327e83ff0de4776f4c37d442594e9111862d1 WatchSource:0}: Error finding container c6c53bda7c5d3c5997c1cf5e6db327e83ff0de4776f4c37d442594e9111862d1: Status 404 returned error can't find the container with id c6c53bda7c5d3c5997c1cf5e6db327e83ff0de4776f4c37d442594e9111862d1 Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.492538 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ft6cr"] Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.506747 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:39 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:39 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:39 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.508509 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.574260 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.695903 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a3cabe4-69ee-49f7-a783-e72ac1a56821-config\") pod \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.696354 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj2pk\" (UniqueName: \"kubernetes.io/projected/5a3cabe4-69ee-49f7-a783-e72ac1a56821-kube-api-access-sj2pk\") pod \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.696438 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a3cabe4-69ee-49f7-a783-e72ac1a56821-serving-cert\") pod \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.696469 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a3cabe4-69ee-49f7-a783-e72ac1a56821-client-ca\") pod \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.697333 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a3cabe4-69ee-49f7-a783-e72ac1a56821-client-ca" (OuterVolumeSpecName: "client-ca") pod "5a3cabe4-69ee-49f7-a783-e72ac1a56821" (UID: "5a3cabe4-69ee-49f7-a783-e72ac1a56821"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.697469 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a3cabe4-69ee-49f7-a783-e72ac1a56821-config" (OuterVolumeSpecName: "config") pod "5a3cabe4-69ee-49f7-a783-e72ac1a56821" (UID: "5a3cabe4-69ee-49f7-a783-e72ac1a56821"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.697888 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a3cabe4-69ee-49f7-a783-e72ac1a56821-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.697915 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a3cabe4-69ee-49f7-a783-e72ac1a56821-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.706046 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a3cabe4-69ee-49f7-a783-e72ac1a56821-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5a3cabe4-69ee-49f7-a783-e72ac1a56821" (UID: "5a3cabe4-69ee-49f7-a783-e72ac1a56821"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.706082 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a3cabe4-69ee-49f7-a783-e72ac1a56821-kube-api-access-sj2pk" (OuterVolumeSpecName: "kube-api-access-sj2pk") pod "5a3cabe4-69ee-49f7-a783-e72ac1a56821" (UID: "5a3cabe4-69ee-49f7-a783-e72ac1a56821"). InnerVolumeSpecName "kube-api-access-sj2pk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.725385 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 11:51:39 crc kubenswrapper[4837]: W0313 11:51:39.732067 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9762555c_fc85_46c5_99a4_0b01577780b0.slice/crio-6127d8b87e94fb8dd1104b1d5a80edde6a809747eb5ba6dc0602986991a215db WatchSource:0}: Error finding container 6127d8b87e94fb8dd1104b1d5a80edde6a809747eb5ba6dc0602986991a215db: Status 404 returned error can't find the container with id 6127d8b87e94fb8dd1104b1d5a80edde6a809747eb5ba6dc0602986991a215db Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.764600 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.778609 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2w96t"] Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.795874 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vx4r8"] Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.798513 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8bc408a-bca6-42ff-8572-2ba9a3978682-serving-cert\") pod \"f8bc408a-bca6-42ff-8572-2ba9a3978682\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.798653 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvs46\" (UniqueName: \"kubernetes.io/projected/f8bc408a-bca6-42ff-8572-2ba9a3978682-kube-api-access-xvs46\") pod \"f8bc408a-bca6-42ff-8572-2ba9a3978682\" (UID: 
\"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.798741 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-config\") pod \"f8bc408a-bca6-42ff-8572-2ba9a3978682\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.798786 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-client-ca\") pod \"f8bc408a-bca6-42ff-8572-2ba9a3978682\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.798890 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-proxy-ca-bundles\") pod \"f8bc408a-bca6-42ff-8572-2ba9a3978682\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.799186 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj2pk\" (UniqueName: \"kubernetes.io/projected/5a3cabe4-69ee-49f7-a783-e72ac1a56821-kube-api-access-sj2pk\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.799216 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a3cabe4-69ee-49f7-a783-e72ac1a56821-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.801162 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-config" (OuterVolumeSpecName: "config") pod "f8bc408a-bca6-42ff-8572-2ba9a3978682" (UID: "f8bc408a-bca6-42ff-8572-2ba9a3978682"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.801242 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-client-ca" (OuterVolumeSpecName: "client-ca") pod "f8bc408a-bca6-42ff-8572-2ba9a3978682" (UID: "f8bc408a-bca6-42ff-8572-2ba9a3978682"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.801311 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5tnrx"] Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.801726 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f8bc408a-bca6-42ff-8572-2ba9a3978682" (UID: "f8bc408a-bca6-42ff-8572-2ba9a3978682"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.808270 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8bc408a-bca6-42ff-8572-2ba9a3978682-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f8bc408a-bca6-42ff-8572-2ba9a3978682" (UID: "f8bc408a-bca6-42ff-8572-2ba9a3978682"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.810098 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8bc408a-bca6-42ff-8572-2ba9a3978682-kube-api-access-xvs46" (OuterVolumeSpecName: "kube-api-access-xvs46") pod "f8bc408a-bca6-42ff-8572-2ba9a3978682" (UID: "f8bc408a-bca6-42ff-8572-2ba9a3978682"). InnerVolumeSpecName "kube-api-access-xvs46". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.814668 4837 ???:1] "http: TLS handshake error from 192.168.126.11:42470: no serving certificate available for the kubelet" Mar 13 11:51:39 crc kubenswrapper[4837]: W0313 11:51:39.838823 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6870caea_07d6_4465_86b1_645a2e29b240.slice/crio-bfe6aca1334934677df8bf272b7d6fdeb1c785b92dcc8ef7c0566c6636ddfaa3 WatchSource:0}: Error finding container bfe6aca1334934677df8bf272b7d6fdeb1c785b92dcc8ef7c0566c6636ddfaa3: Status 404 returned error can't find the container with id bfe6aca1334934677df8bf272b7d6fdeb1c785b92dcc8ef7c0566c6636ddfaa3 Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.900545 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.901136 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8bc408a-bca6-42ff-8572-2ba9a3978682-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.901152 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvs46\" (UniqueName: \"kubernetes.io/projected/f8bc408a-bca6-42ff-8572-2ba9a3978682-kube-api-access-xvs46\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.901169 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.901182 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.177685 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7crb6"] Mar 13 11:51:40 crc kubenswrapper[4837]: E0313 11:51:40.177944 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a3cabe4-69ee-49f7-a783-e72ac1a56821" containerName="route-controller-manager" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.177959 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a3cabe4-69ee-49f7-a783-e72ac1a56821" containerName="route-controller-manager" Mar 13 11:51:40 crc kubenswrapper[4837]: E0313 11:51:40.177977 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8bc408a-bca6-42ff-8572-2ba9a3978682" containerName="controller-manager" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.177986 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8bc408a-bca6-42ff-8572-2ba9a3978682" containerName="controller-manager" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.178116 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a3cabe4-69ee-49f7-a783-e72ac1a56821" containerName="route-controller-manager" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.178134 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8bc408a-bca6-42ff-8572-2ba9a3978682" containerName="controller-manager" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.178998 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.184484 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.196298 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7crb6"] Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.211104 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfddm\" (UniqueName: \"kubernetes.io/projected/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-kube-api-access-wfddm\") pod \"redhat-marketplace-7crb6\" (UID: \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\") " pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.211181 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-utilities\") pod \"redhat-marketplace-7crb6\" (UID: \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\") " pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.211219 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-catalog-content\") pod \"redhat-marketplace-7crb6\" (UID: \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\") " pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.250282 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"9762555c-fc85-46c5-99a4-0b01577780b0","Type":"ContainerStarted","Data":"2af5ddab1d2a04daf9c57e357b7966b38ff85801a9c08e016c2ec482b2f9eb04"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.250335 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9762555c-fc85-46c5-99a4-0b01577780b0","Type":"ContainerStarted","Data":"6127d8b87e94fb8dd1104b1d5a80edde6a809747eb5ba6dc0602986991a215db"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.254038 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" event={"ID":"5a3cabe4-69ee-49f7-a783-e72ac1a56821","Type":"ContainerDied","Data":"2bc8a3d69075e5c30fa5b45ad6a0c6f1944dbbd0064acaad5eadf14dc600adc9"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.254083 4837 scope.go:117] "RemoveContainer" containerID="0b1af16cc6188236788eb10501019d25c79e6c73c18075a85efbfcfdd6e8d90d" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.254183 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.259746 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-84xjl" event={"ID":"8bc71239-c925-4911-bfa5-e7a564dcd654","Type":"ContainerStarted","Data":"3f7712027be97760bdddd9977e9a0c621fce0969c2c77a94b09dcb59e4be8db9"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.262849 4837 generic.go:334] "Generic (PLEG): container finished" podID="831db5b2-5229-4b52-8783-f99c640ba856" containerID="965aad43c7ccd189d4d18246f935c745fc24b5e2cfb5b07896f9492e9109fb55" exitCode=0 Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.263214 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" event={"ID":"831db5b2-5229-4b52-8783-f99c640ba856","Type":"ContainerDied","Data":"965aad43c7ccd189d4d18246f935c745fc24b5e2cfb5b07896f9492e9109fb55"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.265999 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.2659851460000002 podStartE2EDuration="2.265985146s" podCreationTimestamp="2026-03-13 11:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:40.265672056 +0000 UTC m=+215.903938819" watchObservedRunningTime="2026-03-13 11:51:40.265985146 +0000 UTC m=+215.904251909" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.267108 4837 generic.go:334] "Generic (PLEG): container finished" podID="45e6ae52-59ef-446f-917a-549d34ffbf8e" containerID="29b7adb5a9c0b54134cefd4b865e773828049385da6ba275d2791faad9875780" exitCode=0 Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.267182 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-vx4r8" event={"ID":"45e6ae52-59ef-446f-917a-549d34ffbf8e","Type":"ContainerDied","Data":"29b7adb5a9c0b54134cefd4b865e773828049385da6ba275d2791faad9875780"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.267211 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vx4r8" event={"ID":"45e6ae52-59ef-446f-917a-549d34ffbf8e","Type":"ContainerStarted","Data":"b8c38b609b1ee957c7e1e1a563341d86aa7368639c49a74a0e6c541c1d320168"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.270413 4837 generic.go:334] "Generic (PLEG): container finished" podID="e6060cf2-077e-4112-af57-f100e297f320" containerID="92a2c4e8c63e772dff31e06b47098a1634c8714cf4402cee4344a939f71bc1a7" exitCode=0 Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.270494 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft6cr" event={"ID":"e6060cf2-077e-4112-af57-f100e297f320","Type":"ContainerDied","Data":"92a2c4e8c63e772dff31e06b47098a1634c8714cf4402cee4344a939f71bc1a7"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.270524 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft6cr" event={"ID":"e6060cf2-077e-4112-af57-f100e297f320","Type":"ContainerStarted","Data":"4b6c9ae51e3fb9c4dadef31697baf0c351e16ed9f865f9be7126242388f9b2dd"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.282973 4837 generic.go:334] "Generic (PLEG): container finished" podID="278c91cc-2624-42cd-a35e-287e22d22f7d" containerID="2e92979b20e46c7135bec8322dc3792b24e0a0cbeac503bda1fa03e0621168f3" exitCode=0 Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.283046 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twtbj" 
event={"ID":"278c91cc-2624-42cd-a35e-287e22d22f7d","Type":"ContainerDied","Data":"2e92979b20e46c7135bec8322dc3792b24e0a0cbeac503bda1fa03e0621168f3"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.283071 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twtbj" event={"ID":"278c91cc-2624-42cd-a35e-287e22d22f7d","Type":"ContainerStarted","Data":"c6c53bda7c5d3c5997c1cf5e6db327e83ff0de4776f4c37d442594e9111862d1"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.294434 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-84xjl" podStartSLOduration=12.294416627 podStartE2EDuration="12.294416627s" podCreationTimestamp="2026-03-13 11:51:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:40.29388197 +0000 UTC m=+215.932148753" watchObservedRunningTime="2026-03-13 11:51:40.294416627 +0000 UTC m=+215.932683420" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.295318 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" event={"ID":"f8bc408a-bca6-42ff-8572-2ba9a3978682","Type":"ContainerDied","Data":"79d44a31b910bec33360921358068b0857727b0bb4c82bc65255018460fa2174"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.295330 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.301509 4837 scope.go:117] "RemoveContainer" containerID="0737572e5f80685157a6578fd12aead5fdbe12b0fbb802f48732112a9a3e2ca5" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.302033 4837 generic.go:334] "Generic (PLEG): container finished" podID="6870caea-07d6-4465-86b1-645a2e29b240" containerID="ce94a39ad6afbdcb9bafffd4ef0157cee1cf3107185ab6fcc83d0813bc89a94d" exitCode=0 Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.302109 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5tnrx" event={"ID":"6870caea-07d6-4465-86b1-645a2e29b240","Type":"ContainerDied","Data":"ce94a39ad6afbdcb9bafffd4ef0157cee1cf3107185ab6fcc83d0813bc89a94d"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.302142 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5tnrx" event={"ID":"6870caea-07d6-4465-86b1-645a2e29b240","Type":"ContainerStarted","Data":"bfe6aca1334934677df8bf272b7d6fdeb1c785b92dcc8ef7c0566c6636ddfaa3"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.306173 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" event={"ID":"9da9cfd5-f798-42e0-af98-8378cf8d1e5f","Type":"ContainerStarted","Data":"7f6dc77957ef0c3112728bef3166915837fb45018b662ba23f21fb9a5b1d11d9"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.306214 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" event={"ID":"9da9cfd5-f798-42e0-af98-8378cf8d1e5f","Type":"ContainerStarted","Data":"791f2e4e796f079af101ec362853eaa486bb3e46d120e36fdb1c000b9b27a22e"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.306420 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.315971 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-utilities\") pod \"redhat-marketplace-7crb6\" (UID: \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\") " pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.316412 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-catalog-content\") pod \"redhat-marketplace-7crb6\" (UID: \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\") " pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.316499 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfddm\" (UniqueName: \"kubernetes.io/projected/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-kube-api-access-wfddm\") pod \"redhat-marketplace-7crb6\" (UID: \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\") " pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.316660 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-utilities\") pod \"redhat-marketplace-7crb6\" (UID: \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\") " pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.318893 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-catalog-content\") pod \"redhat-marketplace-7crb6\" (UID: \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\") " 
pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.344823 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfddm\" (UniqueName: \"kubernetes.io/projected/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-kube-api-access-wfddm\") pod \"redhat-marketplace-7crb6\" (UID: \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\") " pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.353651 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs"] Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.356055 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs"] Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.373543 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" podStartSLOduration=158.373523393 podStartE2EDuration="2m38.373523393s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:40.371419988 +0000 UTC m=+216.009686771" watchObservedRunningTime="2026-03-13 11:51:40.373523393 +0000 UTC m=+216.011790156" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.457789 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9dbhc"] Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.465413 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9dbhc"] Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.500777 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:40 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:40 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:40 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.500875 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.501592 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-cf76c7dc-qtd9h"] Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.502886 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.505018 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.505693 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.505783 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9"] Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.505836 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.506278 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.506604 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.507027 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.507279 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.515309 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.515802 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.516000 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 
11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.515802 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.516691 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.518727 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.522384 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.523143 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-serving-cert\") pod \"route-controller-manager-6b69f575c8-6gmv9\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.524406 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-proxy-ca-bundles\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.524577 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7km6k\" (UniqueName: \"kubernetes.io/projected/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-kube-api-access-7km6k\") pod \"route-controller-manager-6b69f575c8-6gmv9\" (UID: 
\"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.524776 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2mns\" (UniqueName: \"kubernetes.io/projected/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-kube-api-access-x2mns\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.525025 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-config\") pod \"route-controller-manager-6b69f575c8-6gmv9\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.525163 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-client-ca\") pod \"route-controller-manager-6b69f575c8-6gmv9\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.524287 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.525405 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-cf76c7dc-qtd9h"] Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.527356 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-client-ca\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.527586 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-config\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.527715 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-serving-cert\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.540609 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9"] Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.571172 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jspgm"] Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.572129 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.591448 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jspgm"] Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.629498 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-serving-cert\") pod \"route-controller-manager-6b69f575c8-6gmv9\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.629554 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-utilities\") pod \"redhat-marketplace-jspgm\" (UID: \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\") " pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.629589 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-proxy-ca-bundles\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.629609 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7km6k\" (UniqueName: \"kubernetes.io/projected/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-kube-api-access-7km6k\") pod \"route-controller-manager-6b69f575c8-6gmv9\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc 
kubenswrapper[4837]: I0313 11:51:40.630568 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2mns\" (UniqueName: \"kubernetes.io/projected/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-kube-api-access-x2mns\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.630627 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-config\") pod \"route-controller-manager-6b69f575c8-6gmv9\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.630676 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-client-ca\") pod \"route-controller-manager-6b69f575c8-6gmv9\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.630726 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l42fc\" (UniqueName: \"kubernetes.io/projected/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-kube-api-access-l42fc\") pod \"redhat-marketplace-jspgm\" (UID: \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\") " pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.630778 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-client-ca\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: 
\"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.630824 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-config\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.630856 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-serving-cert\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.640292 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-serving-cert\") pod \"route-controller-manager-6b69f575c8-6gmv9\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.643205 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-proxy-ca-bundles\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.644028 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-config\") 
pod \"route-controller-manager-6b69f575c8-6gmv9\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.644696 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-client-ca\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.644729 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-client-ca\") pod \"route-controller-manager-6b69f575c8-6gmv9\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.644792 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-catalog-content\") pod \"redhat-marketplace-jspgm\" (UID: \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\") " pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.645830 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-config\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.650449 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-serving-cert\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.664389 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7km6k\" (UniqueName: \"kubernetes.io/projected/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-kube-api-access-7km6k\") pod \"route-controller-manager-6b69f575c8-6gmv9\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.668720 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2mns\" (UniqueName: \"kubernetes.io/projected/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-kube-api-access-x2mns\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.747089 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-catalog-content\") pod \"redhat-marketplace-jspgm\" (UID: \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\") " pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.747185 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-utilities\") pod \"redhat-marketplace-jspgm\" (UID: \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\") " pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.747320 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-l42fc\" (UniqueName: \"kubernetes.io/projected/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-kube-api-access-l42fc\") pod \"redhat-marketplace-jspgm\" (UID: \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\") " pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.748557 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-catalog-content\") pod \"redhat-marketplace-jspgm\" (UID: \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\") " pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.749403 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-utilities\") pod \"redhat-marketplace-jspgm\" (UID: \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\") " pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.771593 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7crb6"] Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.779806 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l42fc\" (UniqueName: \"kubernetes.io/projected/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-kube-api-access-l42fc\") pod \"redhat-marketplace-jspgm\" (UID: \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\") " pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.829894 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.846422 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.893881 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.025428 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.029241 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.036370 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.044934 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.049214 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.082276 4837 patch_prober.go:28] interesting pod/console-f9d7485db-q2qpt container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.082328 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-q2qpt" podUID="c83842ec-9933-4f84-bb4a-c84ca61a28e1" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.097397 4837 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="5a3cabe4-69ee-49f7-a783-e72ac1a56821" path="/var/lib/kubelet/pods/5a3cabe4-69ee-49f7-a783-e72ac1a56821/volumes" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.098440 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8bc408a-bca6-42ff-8572-2ba9a3978682" path="/var/lib/kubelet/pods/f8bc408a-bca6-42ff-8572-2ba9a3978682/volumes" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.099125 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.099170 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.099184 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.099294 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.099314 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.108311 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.128987 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.135412 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.145305 4837 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.153031 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b3036ca-4a0e-45d9-9c51-c3faa6067ce3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.153124 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b3036ca-4a0e-45d9-9c51-c3faa6067ce3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.210943 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ng6kk"] Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.212169 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.215983 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.254673 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-catalog-content\") pod \"redhat-operators-ng6kk\" (UID: \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\") " pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.255839 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b3036ca-4a0e-45d9-9c51-c3faa6067ce3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.255880 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-utilities\") pod \"redhat-operators-ng6kk\" (UID: \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\") " pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.255901 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpclz\" (UniqueName: \"kubernetes.io/projected/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-kube-api-access-vpclz\") pod \"redhat-operators-ng6kk\" (UID: \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\") " pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.255927 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b3036ca-4a0e-45d9-9c51-c3faa6067ce3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.256047 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b3036ca-4a0e-45d9-9c51-c3faa6067ce3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.281382 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ng6kk"] Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.282824 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.295814 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b3036ca-4a0e-45d9-9c51-c3faa6067ce3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.304881 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-8dj7w" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.335057 4837 generic.go:334] "Generic (PLEG): container finished" podID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" containerID="86b2c7193d237a632c12a22d24c63b2f2247ec49e6eadaf1bcddddc01304e298" exitCode=0 Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.335115 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-7crb6" event={"ID":"080747b0-3d43-4ff1-b21c-b8ea9fc2f961","Type":"ContainerDied","Data":"86b2c7193d237a632c12a22d24c63b2f2247ec49e6eadaf1bcddddc01304e298"} Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.335142 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7crb6" event={"ID":"080747b0-3d43-4ff1-b21c-b8ea9fc2f961","Type":"ContainerStarted","Data":"3672e1f233b40bf42b048214c1fa7e9647f6025a8a0466aed9482e60a925fb22"} Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.340361 4837 generic.go:334] "Generic (PLEG): container finished" podID="9762555c-fc85-46c5-99a4-0b01577780b0" containerID="2af5ddab1d2a04daf9c57e357b7966b38ff85801a9c08e016c2ec482b2f9eb04" exitCode=0 Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.340410 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9762555c-fc85-46c5-99a4-0b01577780b0","Type":"ContainerDied","Data":"2af5ddab1d2a04daf9c57e357b7966b38ff85801a9c08e016c2ec482b2f9eb04"} Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.343944 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9"] Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.357719 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-utilities\") pod \"redhat-operators-ng6kk\" (UID: \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\") " pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.357776 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpclz\" (UniqueName: \"kubernetes.io/projected/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-kube-api-access-vpclz\") pod \"redhat-operators-ng6kk\" (UID: 
\"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\") " pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.357934 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-catalog-content\") pod \"redhat-operators-ng6kk\" (UID: \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\") " pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.359460 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-utilities\") pod \"redhat-operators-ng6kk\" (UID: \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\") " pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.360054 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-catalog-content\") pod \"redhat-operators-ng6kk\" (UID: \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\") " pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.386941 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.398765 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-cf76c7dc-qtd9h"] Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.400527 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpclz\" (UniqueName: \"kubernetes.io/projected/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-kube-api-access-vpclz\") pod \"redhat-operators-ng6kk\" (UID: \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\") " pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.408287 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.495732 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.497873 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:41 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:41 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:41 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.497929 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.588464 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.607317 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j246z"] Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.608351 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.613011 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j246z"] Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.653201 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jspgm"] Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.763652 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-catalog-content\") pod \"redhat-operators-j246z\" (UID: \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\") " pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.763725 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-utilities\") pod \"redhat-operators-j246z\" (UID: \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\") " pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.763750 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx4zq\" (UniqueName: \"kubernetes.io/projected/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-kube-api-access-xx4zq\") pod \"redhat-operators-j246z\" (UID: \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\") " 
pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.865360 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-utilities\") pod \"redhat-operators-j246z\" (UID: \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\") " pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.865833 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx4zq\" (UniqueName: \"kubernetes.io/projected/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-kube-api-access-xx4zq\") pod \"redhat-operators-j246z\" (UID: \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\") " pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.865949 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-catalog-content\") pod \"redhat-operators-j246z\" (UID: \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\") " pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.866440 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-catalog-content\") pod \"redhat-operators-j246z\" (UID: \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\") " pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.866720 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-utilities\") pod \"redhat-operators-j246z\" (UID: \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\") " pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:51:41 crc 
kubenswrapper[4837]: I0313 11:51:41.871151 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.910267 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx4zq\" (UniqueName: \"kubernetes.io/projected/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-kube-api-access-xx4zq\") pod \"redhat-operators-j246z\" (UID: \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\") " pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.966752 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/831db5b2-5229-4b52-8783-f99c640ba856-config-volume\") pod \"831db5b2-5229-4b52-8783-f99c640ba856\" (UID: \"831db5b2-5229-4b52-8783-f99c640ba856\") " Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.966822 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl4f8\" (UniqueName: \"kubernetes.io/projected/831db5b2-5229-4b52-8783-f99c640ba856-kube-api-access-pl4f8\") pod \"831db5b2-5229-4b52-8783-f99c640ba856\" (UID: \"831db5b2-5229-4b52-8783-f99c640ba856\") " Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.966857 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/831db5b2-5229-4b52-8783-f99c640ba856-secret-volume\") pod \"831db5b2-5229-4b52-8783-f99c640ba856\" (UID: \"831db5b2-5229-4b52-8783-f99c640ba856\") " Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.968993 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/831db5b2-5229-4b52-8783-f99c640ba856-config-volume" (OuterVolumeSpecName: "config-volume") pod "831db5b2-5229-4b52-8783-f99c640ba856" (UID: 
"831db5b2-5229-4b52-8783-f99c640ba856"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.973377 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/831db5b2-5229-4b52-8783-f99c640ba856-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "831db5b2-5229-4b52-8783-f99c640ba856" (UID: "831db5b2-5229-4b52-8783-f99c640ba856"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.975712 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/831db5b2-5229-4b52-8783-f99c640ba856-kube-api-access-pl4f8" (OuterVolumeSpecName: "kube-api-access-pl4f8") pod "831db5b2-5229-4b52-8783-f99c640ba856" (UID: "831db5b2-5229-4b52-8783-f99c640ba856"). InnerVolumeSpecName "kube-api-access-pl4f8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.979076 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:41.999452 4837 patch_prober.go:28] interesting pod/downloads-7954f5f757-8ktsx container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:41.999724 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8ktsx" podUID="85ac6950-8b98-4d0c-8a2b-7eeeac8d1435" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:41.999959 4837 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-8ktsx container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.000059 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8ktsx" podUID="85ac6950-8b98-4d0c-8a2b-7eeeac8d1435" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.069780 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/831db5b2-5229-4b52-8783-f99c640ba856-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.070303 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl4f8\" (UniqueName: \"kubernetes.io/projected/831db5b2-5229-4b52-8783-f99c640ba856-kube-api-access-pl4f8\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.070319 4837 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/831db5b2-5229-4b52-8783-f99c640ba856-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.085203 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.177365 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.179568 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.369288 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ng6kk"] Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.409003 4837 ???:1] "http: TLS handshake error from 192.168.126.11:42486: no serving certificate available for the kubelet" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.447000 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" event={"ID":"12e5f732-00c7-49ae-9e3e-121aa7caa6ee","Type":"ContainerStarted","Data":"b82fa6f2134589dee51636289f2f8c0ff8d4c77d04a184b0382c40aa9a2b8bdc"} Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.447048 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" event={"ID":"12e5f732-00c7-49ae-9e3e-121aa7caa6ee","Type":"ContainerStarted","Data":"351e128a579d7bea389593621f8531499b1484f659ddbed7034ee720b2bb6945"} Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.450275 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.474159 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:42 crc 
kubenswrapper[4837]: I0313 11:51:42.477929 4837 generic.go:334] "Generic (PLEG): container finished" podID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" containerID="23cf1d25c2a824231d5fd39eb5a9920394e587e257254f6ffa6bf893fe9f2624" exitCode=0 Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.478014 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jspgm" event={"ID":"5236ae0e-b305-4f1c-9125-bbac1eeb07f3","Type":"ContainerDied","Data":"23cf1d25c2a824231d5fd39eb5a9920394e587e257254f6ffa6bf893fe9f2624"} Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.478041 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jspgm" event={"ID":"5236ae0e-b305-4f1c-9125-bbac1eeb07f3","Type":"ContainerStarted","Data":"307f294c9c816d0f8c581cbf3561f2a5e0cff01395517438e2ad320ce61f35e4"} Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.492369 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3","Type":"ContainerStarted","Data":"68c9618069477f1223e3ecc0d2dec6041262b1cf792bbe2a6cc6e41865aea27a"} Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.492720 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" podStartSLOduration=3.4927026469999998 podStartE2EDuration="3.492702647s" podCreationTimestamp="2026-03-13 11:51:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:42.490681454 +0000 UTC m=+218.128948227" watchObservedRunningTime="2026-03-13 11:51:42.492702647 +0000 UTC m=+218.130969410" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.499308 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: 
Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:42 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:42 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:42 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.499633 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.505271 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" event={"ID":"831db5b2-5229-4b52-8783-f99c640ba856","Type":"ContainerDied","Data":"3c294e8fd793300ee1caaf9d2068d0b14e0a9dc058385ad90bb33a7237e0e283"} Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.505307 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c294e8fd793300ee1caaf9d2068d0b14e0a9dc058385ad90bb33a7237e0e283" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.505379 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.520000 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" event={"ID":"a3c9b59a-0eeb-49e0-86ef-30222e5926aa","Type":"ContainerStarted","Data":"04c1b4cdc66c99fd06f47d8f53d5a9118d0695a5ac3f712471886566797fee43"} Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.520127 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" event={"ID":"a3c9b59a-0eeb-49e0-86ef-30222e5926aa","Type":"ContainerStarted","Data":"d7a58daa62b7a3f44dc2a8d87fb35984d20d34c979da845c5833bab0d1c0d7f2"} Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.526782 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.541797 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.597940 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" podStartSLOduration=3.597915461 podStartE2EDuration="3.597915461s" podCreationTimestamp="2026-03-13 11:51:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:42.585046458 +0000 UTC m=+218.223313211" watchObservedRunningTime="2026-03-13 11:51:42.597915461 +0000 UTC m=+218.236182224" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.983052 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j246z"] Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 
11:51:42.996870 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:51:42 crc kubenswrapper[4837]: W0313 11:51:42.998709 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32a36cbe_a17f_46bf_9c6a_1df6f427e2c6.slice/crio-524d259feb76e4121fddc10b32a9829c69c7b137ab82d2d2c18f81ea9d556b60 WatchSource:0}: Error finding container 524d259feb76e4121fddc10b32a9829c69c7b137ab82d2d2c18f81ea9d556b60: Status 404 returned error can't find the container with id 524d259feb76e4121fddc10b32a9829c69c7b137ab82d2d2c18f81ea9d556b60 Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.112185 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9762555c-fc85-46c5-99a4-0b01577780b0-kubelet-dir\") pod \"9762555c-fc85-46c5-99a4-0b01577780b0\" (UID: \"9762555c-fc85-46c5-99a4-0b01577780b0\") " Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.113114 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9762555c-fc85-46c5-99a4-0b01577780b0-kube-api-access\") pod \"9762555c-fc85-46c5-99a4-0b01577780b0\" (UID: \"9762555c-fc85-46c5-99a4-0b01577780b0\") " Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.113304 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.113336 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.113363 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.113408 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.114396 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9762555c-fc85-46c5-99a4-0b01577780b0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9762555c-fc85-46c5-99a4-0b01577780b0" (UID: "9762555c-fc85-46c5-99a4-0b01577780b0"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.116737 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.128118 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.129287 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.129518 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.129727 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9762555c-fc85-46c5-99a4-0b01577780b0-kube-api-access" 
(OuterVolumeSpecName: "kube-api-access") pod "9762555c-fc85-46c5-99a4-0b01577780b0" (UID: "9762555c-fc85-46c5-99a4-0b01577780b0"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.214408 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.214509 4837 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9762555c-fc85-46c5-99a4-0b01577780b0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.214524 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9762555c-fc85-46c5-99a4-0b01577780b0-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.220660 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.260806 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.271646 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.279400 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.471265 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.494979 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:43 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:43 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:43 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.495213 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.544777 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3","Type":"ContainerStarted","Data":"5f2332ff2aa4ac65770f7fb36b9b44babbbf5ea1ede6559d7a38c358e4838de4"} Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.565911 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.565890414 podStartE2EDuration="2.565890414s" podCreationTimestamp="2026-03-13 11:51:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:43.565211424 +0000 UTC m=+219.203478197" watchObservedRunningTime="2026-03-13 11:51:43.565890414 +0000 UTC m=+219.204157187" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.570503 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9762555c-fc85-46c5-99a4-0b01577780b0","Type":"ContainerDied","Data":"6127d8b87e94fb8dd1104b1d5a80edde6a809747eb5ba6dc0602986991a215db"} Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.570543 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6127d8b87e94fb8dd1104b1d5a80edde6a809747eb5ba6dc0602986991a215db" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.570598 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.582021 4837 generic.go:334] "Generic (PLEG): container finished" podID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" containerID="613107c1ce24dcf9cb1cf0c1623f3de9a7d5b33bc09c57a646911cae7011d82e" exitCode=0 Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.582108 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j246z" event={"ID":"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6","Type":"ContainerDied","Data":"613107c1ce24dcf9cb1cf0c1623f3de9a7d5b33bc09c57a646911cae7011d82e"} Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.582137 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j246z" event={"ID":"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6","Type":"ContainerStarted","Data":"524d259feb76e4121fddc10b32a9829c69c7b137ab82d2d2c18f81ea9d556b60"} Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.590520 4837 generic.go:334] "Generic (PLEG): container finished" 
podID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" containerID="96174c590656df138c1d79af4e8416815fc220b78535e35ba951e58fb70ac305" exitCode=0 Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.590674 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng6kk" event={"ID":"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d","Type":"ContainerDied","Data":"96174c590656df138c1d79af4e8416815fc220b78535e35ba951e58fb70ac305"} Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.590720 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng6kk" event={"ID":"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d","Type":"ContainerStarted","Data":"eac45e620e44e693cbb55f704b7783d81f0f024e3e2cf4051be3383dc9b6b145"} Mar 13 11:51:43 crc kubenswrapper[4837]: W0313 11:51:43.959269 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-9a1a9acd8c95fe633e0ad75eed184ecded06d9cd1da532cd24c0c79c70ede582 WatchSource:0}: Error finding container 9a1a9acd8c95fe633e0ad75eed184ecded06d9cd1da532cd24c0c79c70ede582: Status 404 returned error can't find the container with id 9a1a9acd8c95fe633e0ad75eed184ecded06d9cd1da532cd24c0c79c70ede582 Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.049709 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cjn4q"] Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.290256 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-z9thp" Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.498889 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:44 crc 
kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:44 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:44 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.498966 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.622785 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" event={"ID":"86e5afeb-4720-4593-a53e-dfb5381d0b1d","Type":"ContainerStarted","Data":"5cf98a5b729f9333f1f80a59486dc7faa7bcb28e5a0ff758d9fc65192d2b963f"} Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.629241 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"470c532726880d1e6f64d1b2504c3040447e21a8badc2ff8ff10632e0dfad3b7"} Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.631185 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"5c62d412c0b98557eabe844fb8b508b384241ec442c29112b6bcb46aecab33af"} Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.659682 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f92ebade66e7aabb858b5a2cd9e46c26aa00174bf8ba4e8fbf822142b02c3cba"} Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.659754 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9a1a9acd8c95fe633e0ad75eed184ecded06d9cd1da532cd24c0c79c70ede582"} Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.670147 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7bc6ebb4cef7add0cd370a574f88318c57908982e7687b876dd4f22f8dba508e"} Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.670205 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"a4d28a7b7e7e955e2a6a9bca6a2142d04feb7e810a4e97116b1e3c960df2ae1f"} Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.671111 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.673531 4837 generic.go:334] "Generic (PLEG): container finished" podID="5b3036ca-4a0e-45d9-9c51-c3faa6067ce3" containerID="5f2332ff2aa4ac65770f7fb36b9b44babbbf5ea1ede6559d7a38c358e4838de4" exitCode=0 Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.673916 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3","Type":"ContainerDied","Data":"5f2332ff2aa4ac65770f7fb36b9b44babbbf5ea1ede6559d7a38c358e4838de4"} Mar 13 11:51:45 crc kubenswrapper[4837]: I0313 11:51:45.493374 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:45 crc 
kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:45 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:45 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:45 crc kubenswrapper[4837]: I0313 11:51:45.493441 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:45 crc kubenswrapper[4837]: I0313 11:51:45.682902 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" event={"ID":"86e5afeb-4720-4593-a53e-dfb5381d0b1d","Type":"ContainerStarted","Data":"e1ea74358708fea67b17abc9bede9d292fe5bf85f5d2f9c7ae44e817c98fa621"} Mar 13 11:51:45 crc kubenswrapper[4837]: I0313 11:51:45.903630 4837 ???:1] "http: TLS handshake error from 192.168.126.11:52396: no serving certificate available for the kubelet" Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.116253 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.190353 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b3036ca-4a0e-45d9-9c51-c3faa6067ce3-kube-api-access\") pod \"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3\" (UID: \"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3\") " Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.190472 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b3036ca-4a0e-45d9-9c51-c3faa6067ce3-kubelet-dir\") pod \"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3\" (UID: \"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3\") " Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.190801 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5b3036ca-4a0e-45d9-9c51-c3faa6067ce3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5b3036ca-4a0e-45d9-9c51-c3faa6067ce3" (UID: "5b3036ca-4a0e-45d9-9c51-c3faa6067ce3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.204772 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b3036ca-4a0e-45d9-9c51-c3faa6067ce3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5b3036ca-4a0e-45d9-9c51-c3faa6067ce3" (UID: "5b3036ca-4a0e-45d9-9c51-c3faa6067ce3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.291826 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b3036ca-4a0e-45d9-9c51-c3faa6067ce3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.291857 4837 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b3036ca-4a0e-45d9-9c51-c3faa6067ce3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.493700 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:46 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:46 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:46 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.493774 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.753203 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.753332 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3","Type":"ContainerDied","Data":"68c9618069477f1223e3ecc0d2dec6041262b1cf792bbe2a6cc6e41865aea27a"} Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.753374 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68c9618069477f1223e3ecc0d2dec6041262b1cf792bbe2a6cc6e41865aea27a" Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.775894 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" event={"ID":"86e5afeb-4720-4593-a53e-dfb5381d0b1d","Type":"ContainerStarted","Data":"be923e781809c02a28a4c9369907b19014d2a115043ade3ff095562f00fa19e4"} Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.803027 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-cjn4q" podStartSLOduration=164.803004427 podStartE2EDuration="2m44.803004427s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:46.802358927 +0000 UTC m=+222.440625690" watchObservedRunningTime="2026-03-13 11:51:46.803004427 +0000 UTC m=+222.441271190" Mar 13 11:51:47 crc kubenswrapper[4837]: I0313 11:51:47.493131 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:47 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:47 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:47 crc 
kubenswrapper[4837]: healthz check failed Mar 13 11:51:47 crc kubenswrapper[4837]: I0313 11:51:47.493205 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:47 crc kubenswrapper[4837]: I0313 11:51:47.549811 4837 ???:1] "http: TLS handshake error from 192.168.126.11:52398: no serving certificate available for the kubelet" Mar 13 11:51:48 crc kubenswrapper[4837]: I0313 11:51:48.493707 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:48 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:48 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:48 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:48 crc kubenswrapper[4837]: I0313 11:51:48.493823 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:49 crc kubenswrapper[4837]: I0313 11:51:49.492130 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:49 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:49 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:49 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:49 crc kubenswrapper[4837]: I0313 11:51:49.492221 4837 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 11:51:50 crc kubenswrapper[4837]: I0313 11:51:50.497793 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 11:51:50 crc kubenswrapper[4837]: [+]has-synced ok
Mar 13 11:51:50 crc kubenswrapper[4837]: [+]process-running ok
Mar 13 11:51:50 crc kubenswrapper[4837]: healthz check failed
Mar 13 11:51:50 crc kubenswrapper[4837]: I0313 11:51:50.497871 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 11:51:51 crc kubenswrapper[4837]: I0313 11:51:51.074345 4837 patch_prober.go:28] interesting pod/console-f9d7485db-q2qpt container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Mar 13 11:51:51 crc kubenswrapper[4837]: I0313 11:51:51.074699 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-q2qpt" podUID="c83842ec-9933-4f84-bb4a-c84ca61a28e1" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused"
Mar 13 11:51:51 crc kubenswrapper[4837]: I0313 11:51:51.493298 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-9tkxg"
Mar 13 11:51:51 crc kubenswrapper[4837]: I0313 11:51:51.495945 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-9tkxg"
Mar 13 11:51:52 crc kubenswrapper[4837]: I0313 11:51:52.004867 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-8ktsx"
Mar 13 11:51:58 crc kubenswrapper[4837]: I0313 11:51:58.102486 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-cf76c7dc-qtd9h"]
Mar 13 11:51:58 crc kubenswrapper[4837]: I0313 11:51:58.103164 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" podUID="a3c9b59a-0eeb-49e0-86ef-30222e5926aa" containerName="controller-manager" containerID="cri-o://04c1b4cdc66c99fd06f47d8f53d5a9118d0695a5ac3f712471886566797fee43" gracePeriod=30
Mar 13 11:51:58 crc kubenswrapper[4837]: I0313 11:51:58.122045 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9"]
Mar 13 11:51:58 crc kubenswrapper[4837]: I0313 11:51:58.122311 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" podUID="12e5f732-00c7-49ae-9e3e-121aa7caa6ee" containerName="route-controller-manager" containerID="cri-o://b82fa6f2134589dee51636289f2f8c0ff8d4c77d04a184b0382c40aa9a2b8bdc" gracePeriod=30
Mar 13 11:51:58 crc kubenswrapper[4837]: I0313 11:51:58.918160 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-2w96t"
Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.137715 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556712-g8877"]
Mar 13 11:52:00 crc kubenswrapper[4837]: E0313 11:52:00.138112 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b3036ca-4a0e-45d9-9c51-c3faa6067ce3" containerName="pruner"
Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.138134 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b3036ca-4a0e-45d9-9c51-c3faa6067ce3" containerName="pruner"
Mar 13 11:52:00 crc kubenswrapper[4837]: E0313 11:52:00.138153 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="831db5b2-5229-4b52-8783-f99c640ba856" containerName="collect-profiles"
Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.138162 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="831db5b2-5229-4b52-8783-f99c640ba856" containerName="collect-profiles"
Mar 13 11:52:00 crc kubenswrapper[4837]: E0313 11:52:00.138183 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9762555c-fc85-46c5-99a4-0b01577780b0" containerName="pruner"
Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.138193 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9762555c-fc85-46c5-99a4-0b01577780b0" containerName="pruner"
Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.138413 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="831db5b2-5229-4b52-8783-f99c640ba856" containerName="collect-profiles"
Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.138444 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b3036ca-4a0e-45d9-9c51-c3faa6067ce3" containerName="pruner"
Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.138460 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="9762555c-fc85-46c5-99a4-0b01577780b0" containerName="pruner"
Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.139214 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556712-g8877"
Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.141794 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj"
Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.146772 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556712-g8877"]
Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.179078 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xqlb\" (UniqueName: \"kubernetes.io/projected/87edec8a-33b2-44c0-bbcb-1e4f5dded1b2-kube-api-access-9xqlb\") pod \"auto-csr-approver-29556712-g8877\" (UID: \"87edec8a-33b2-44c0-bbcb-1e4f5dded1b2\") " pod="openshift-infra/auto-csr-approver-29556712-g8877"
Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.279798 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xqlb\" (UniqueName: \"kubernetes.io/projected/87edec8a-33b2-44c0-bbcb-1e4f5dded1b2-kube-api-access-9xqlb\") pod \"auto-csr-approver-29556712-g8877\" (UID: \"87edec8a-33b2-44c0-bbcb-1e4f5dded1b2\") " pod="openshift-infra/auto-csr-approver-29556712-g8877"
Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.298383 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xqlb\" (UniqueName: \"kubernetes.io/projected/87edec8a-33b2-44c0-bbcb-1e4f5dded1b2-kube-api-access-9xqlb\") pod \"auto-csr-approver-29556712-g8877\" (UID: \"87edec8a-33b2-44c0-bbcb-1e4f5dded1b2\") " pod="openshift-infra/auto-csr-approver-29556712-g8877"
Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.581071 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556712-g8877"
Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.831320 4837 patch_prober.go:28] interesting pod/controller-manager-cf76c7dc-qtd9h container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body=
Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.831407 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" podUID="a3c9b59a-0eeb-49e0-86ef-30222e5926aa" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused"
Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.848550 4837 patch_prober.go:28] interesting pod/route-controller-manager-6b69f575c8-6gmv9 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body=
Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.848608 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" podUID="12e5f732-00c7-49ae-9e3e-121aa7caa6ee" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused"
Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.884278 4837 generic.go:334] "Generic (PLEG): container finished" podID="12e5f732-00c7-49ae-9e3e-121aa7caa6ee" containerID="b82fa6f2134589dee51636289f2f8c0ff8d4c77d04a184b0382c40aa9a2b8bdc" exitCode=0
Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.884349 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" event={"ID":"12e5f732-00c7-49ae-9e3e-121aa7caa6ee","Type":"ContainerDied","Data":"b82fa6f2134589dee51636289f2f8c0ff8d4c77d04a184b0382c40aa9a2b8bdc"}
Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.886325 4837 generic.go:334] "Generic (PLEG): container finished" podID="a3c9b59a-0eeb-49e0-86ef-30222e5926aa" containerID="04c1b4cdc66c99fd06f47d8f53d5a9118d0695a5ac3f712471886566797fee43" exitCode=0
Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.886348 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" event={"ID":"a3c9b59a-0eeb-49e0-86ef-30222e5926aa","Type":"ContainerDied","Data":"04c1b4cdc66c99fd06f47d8f53d5a9118d0695a5ac3f712471886566797fee43"}
Mar 13 11:52:01 crc kubenswrapper[4837]: I0313 11:52:01.078079 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-q2qpt"
Mar 13 11:52:01 crc kubenswrapper[4837]: I0313 11:52:01.082183 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-q2qpt"
Mar 13 11:52:05 crc kubenswrapper[4837]: I0313 11:52:05.484227 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 11:52:05 crc kubenswrapper[4837]: I0313 11:52:05.484597 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 11:52:08 crc kubenswrapper[4837]: I0313 11:52:08.053692 4837 ???:1] "http: TLS handshake error from 192.168.126.11:39490: no serving certificate available for the kubelet"
Mar 13 11:52:11 crc kubenswrapper[4837]: I0313 11:52:11.534400 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc"
Mar 13 11:52:11 crc kubenswrapper[4837]: I0313 11:52:11.832437 4837 patch_prober.go:28] interesting pod/controller-manager-cf76c7dc-qtd9h container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 11:52:11 crc kubenswrapper[4837]: I0313 11:52:11.832731 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" podUID="a3c9b59a-0eeb-49e0-86ef-30222e5926aa" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 11:52:11 crc kubenswrapper[4837]: I0313 11:52:11.848793 4837 patch_prober.go:28] interesting pod/route-controller-manager-6b69f575c8-6gmv9 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 11:52:11 crc kubenswrapper[4837]: I0313 11:52:11.848876 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" podUID="12e5f732-00c7-49ae-9e3e-121aa7caa6ee" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 11:52:12 crc kubenswrapper[4837]: E0313 11:52:12.957820 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest"
Mar 13 11:52:12 crc kubenswrapper[4837]: E0313 11:52:12.958238 4837 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 13 11:52:12 crc kubenswrapper[4837]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve
Mar 13 11:52:12 crc kubenswrapper[4837]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dlwdw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29556710-lcprh_openshift-infra(0484d991-f239-47a2-80ff-0237945c27ac): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled
Mar 13 11:52:12 crc kubenswrapper[4837]: > logger="UnhandledError"
Mar 13 11:52:12 crc kubenswrapper[4837]: E0313 11:52:12.960005 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29556710-lcprh" podUID="0484d991-f239-47a2-80ff-0237945c27ac"
Mar 13 11:52:12 crc kubenswrapper[4837]: I0313 11:52:12.969271 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h"
Mar 13 11:52:12 crc kubenswrapper[4837]: I0313 11:52:12.972950 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9"
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:12.999324 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"]
Mar 13 11:52:13 crc kubenswrapper[4837]: E0313 11:52:13.000029 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12e5f732-00c7-49ae-9e3e-121aa7caa6ee" containerName="route-controller-manager"
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.000042 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="12e5f732-00c7-49ae-9e3e-121aa7caa6ee" containerName="route-controller-manager"
Mar 13 11:52:13 crc kubenswrapper[4837]: E0313 11:52:13.000053 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c9b59a-0eeb-49e0-86ef-30222e5926aa" containerName="controller-manager"
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.000059 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c9b59a-0eeb-49e0-86ef-30222e5926aa" containerName="controller-manager"
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.000156 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c9b59a-0eeb-49e0-86ef-30222e5926aa" containerName="controller-manager"
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.000165 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="12e5f732-00c7-49ae-9e3e-121aa7caa6ee" containerName="route-controller-manager"
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.000501 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.013974 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"]
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.040618 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-serving-cert\") pod \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") "
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.040698 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-config\") pod \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") "
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.040931 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-config\") pod \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") "
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.040963 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2mns\" (UniqueName: \"kubernetes.io/projected/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-kube-api-access-x2mns\") pod \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") "
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.041002 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7km6k\" (UniqueName: \"kubernetes.io/projected/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-kube-api-access-7km6k\") pod \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") "
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.041029 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-proxy-ca-bundles\") pod \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") "
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.041043 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-client-ca\") pod \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") "
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.041084 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-client-ca\") pod \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") "
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.041104 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-serving-cert\") pod \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") "
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.041223 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-proxy-ca-bundles\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.041258 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4k7j\" (UniqueName: \"kubernetes.io/projected/9af86f4e-8143-426d-98a6-b59bde2a6247-kube-api-access-q4k7j\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.041284 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-config\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.041311 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-client-ca\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.041326 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9af86f4e-8143-426d-98a6-b59bde2a6247-serving-cert\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.041766 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-config" (OuterVolumeSpecName: "config") pod "a3c9b59a-0eeb-49e0-86ef-30222e5926aa" (UID: "a3c9b59a-0eeb-49e0-86ef-30222e5926aa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.041793 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-config" (OuterVolumeSpecName: "config") pod "12e5f732-00c7-49ae-9e3e-121aa7caa6ee" (UID: "12e5f732-00c7-49ae-9e3e-121aa7caa6ee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.042340 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-client-ca" (OuterVolumeSpecName: "client-ca") pod "12e5f732-00c7-49ae-9e3e-121aa7caa6ee" (UID: "12e5f732-00c7-49ae-9e3e-121aa7caa6ee"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.042532 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a3c9b59a-0eeb-49e0-86ef-30222e5926aa" (UID: "a3c9b59a-0eeb-49e0-86ef-30222e5926aa"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.043362 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-client-ca" (OuterVolumeSpecName: "client-ca") pod "a3c9b59a-0eeb-49e0-86ef-30222e5926aa" (UID: "a3c9b59a-0eeb-49e0-86ef-30222e5926aa"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.046033 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "12e5f732-00c7-49ae-9e3e-121aa7caa6ee" (UID: "12e5f732-00c7-49ae-9e3e-121aa7caa6ee"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.047043 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-kube-api-access-7km6k" (OuterVolumeSpecName: "kube-api-access-7km6k") pod "12e5f732-00c7-49ae-9e3e-121aa7caa6ee" (UID: "12e5f732-00c7-49ae-9e3e-121aa7caa6ee"). InnerVolumeSpecName "kube-api-access-7km6k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.047149 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-kube-api-access-x2mns" (OuterVolumeSpecName: "kube-api-access-x2mns") pod "a3c9b59a-0eeb-49e0-86ef-30222e5926aa" (UID: "a3c9b59a-0eeb-49e0-86ef-30222e5926aa"). InnerVolumeSpecName "kube-api-access-x2mns". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.061117 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a3c9b59a-0eeb-49e0-86ef-30222e5926aa" (UID: "a3c9b59a-0eeb-49e0-86ef-30222e5926aa"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.142990 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4k7j\" (UniqueName: \"kubernetes.io/projected/9af86f4e-8143-426d-98a6-b59bde2a6247-kube-api-access-q4k7j\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.143045 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-config\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.143072 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-client-ca\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.143093 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9af86f4e-8143-426d-98a6-b59bde2a6247-serving-cert\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.143156 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-proxy-ca-bundles\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.143215 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-config\") on node \"crc\" DevicePath \"\""
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.143226 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2mns\" (UniqueName: \"kubernetes.io/projected/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-kube-api-access-x2mns\") on node \"crc\" DevicePath \"\""
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.143237 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7km6k\" (UniqueName: \"kubernetes.io/projected/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-kube-api-access-7km6k\") on node \"crc\" DevicePath \"\""
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.143246 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.143254 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-client-ca\") on node \"crc\" DevicePath \"\""
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.143264 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-client-ca\") on node \"crc\" DevicePath \"\""
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.143273 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.143281 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.143290 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-config\") on node \"crc\" DevicePath \"\""
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.144436 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-client-ca\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.144507 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-config\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.144814 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-proxy-ca-bundles\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.148172 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9af86f4e-8143-426d-98a6-b59bde2a6247-serving-cert\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.159459 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4k7j\" (UniqueName: \"kubernetes.io/projected/9af86f4e-8143-426d-98a6-b59bde2a6247-kube-api-access-q4k7j\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.165361 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h"
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.165357 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" event={"ID":"a3c9b59a-0eeb-49e0-86ef-30222e5926aa","Type":"ContainerDied","Data":"d7a58daa62b7a3f44dc2a8d87fb35984d20d34c979da845c5833bab0d1c0d7f2"}
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.165491 4837 scope.go:117] "RemoveContainer" containerID="04c1b4cdc66c99fd06f47d8f53d5a9118d0695a5ac3f712471886566797fee43"
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.167010 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" event={"ID":"12e5f732-00c7-49ae-9e3e-121aa7caa6ee","Type":"ContainerDied","Data":"351e128a579d7bea389593621f8531499b1484f659ddbed7034ee720b2bb6945"}
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.167054 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9"
Mar 13 11:52:13 crc kubenswrapper[4837]: E0313 11:52:13.168914 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29556710-lcprh" podUID="0484d991-f239-47a2-80ff-0237945c27ac"
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.194061 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9"]
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.197384 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9"]
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.203125 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-cf76c7dc-qtd9h"]
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.206984 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-cf76c7dc-qtd9h"]
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.280805 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.328863 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"
Mar 13 11:52:14 crc kubenswrapper[4837]: I0313 11:52:14.410548 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 13 11:52:14 crc kubenswrapper[4837]: I0313 11:52:14.411611 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 13 11:52:14 crc kubenswrapper[4837]: I0313 11:52:14.414311 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 13 11:52:14 crc kubenswrapper[4837]: I0313 11:52:14.414513 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 13 11:52:14 crc kubenswrapper[4837]: I0313 11:52:14.421435 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 13 11:52:14 crc kubenswrapper[4837]: I0313 11:52:14.462383 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0db827be-f908-46a8-9402-a858214284e7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0db827be-f908-46a8-9402-a858214284e7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 13 11:52:14 crc kubenswrapper[4837]: I0313 11:52:14.462439 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0db827be-f908-46a8-9402-a858214284e7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0db827be-f908-46a8-9402-a858214284e7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 13 11:52:14 crc kubenswrapper[4837]: I0313 11:52:14.563209 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0db827be-f908-46a8-9402-a858214284e7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0db827be-f908-46a8-9402-a858214284e7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 13 11:52:14 crc kubenswrapper[4837]: I0313 11:52:14.563571 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0db827be-f908-46a8-9402-a858214284e7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0db827be-f908-46a8-9402-a858214284e7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 13 11:52:14 crc kubenswrapper[4837]: I0313 11:52:14.564605 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0db827be-f908-46a8-9402-a858214284e7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0db827be-f908-46a8-9402-a858214284e7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 13 11:52:14 crc kubenswrapper[4837]: I0313 11:52:14.583698 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0db827be-f908-46a8-9402-a858214284e7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0db827be-f908-46a8-9402-a858214284e7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 13 11:52:14 crc kubenswrapper[4837]: I0313 11:52:14.742559 4837 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.056270 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12e5f732-00c7-49ae-9e3e-121aa7caa6ee" path="/var/lib/kubelet/pods/12e5f732-00c7-49ae-9e3e-121aa7caa6ee/volumes" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.057067 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3c9b59a-0eeb-49e0-86ef-30222e5926aa" path="/var/lib/kubelet/pods/a3c9b59a-0eeb-49e0-86ef-30222e5926aa/volumes" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.527656 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5"] Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.549371 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5"] Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.549477 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.551898 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.552160 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.552370 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.553181 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.553388 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.553592 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.576451 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krwzg\" (UniqueName: \"kubernetes.io/projected/94a9fa12-c97d-4b13-81a1-da33f15c7f42-kube-api-access-krwzg\") pod \"route-controller-manager-588697dd78-t4tn5\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.576504 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94a9fa12-c97d-4b13-81a1-da33f15c7f42-serving-cert\") pod 
\"route-controller-manager-588697dd78-t4tn5\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.576566 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94a9fa12-c97d-4b13-81a1-da33f15c7f42-client-ca\") pod \"route-controller-manager-588697dd78-t4tn5\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.576619 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a9fa12-c97d-4b13-81a1-da33f15c7f42-config\") pod \"route-controller-manager-588697dd78-t4tn5\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.677888 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a9fa12-c97d-4b13-81a1-da33f15c7f42-config\") pod \"route-controller-manager-588697dd78-t4tn5\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.677953 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krwzg\" (UniqueName: \"kubernetes.io/projected/94a9fa12-c97d-4b13-81a1-da33f15c7f42-kube-api-access-krwzg\") pod \"route-controller-manager-588697dd78-t4tn5\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 
11:52:15.677975 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94a9fa12-c97d-4b13-81a1-da33f15c7f42-serving-cert\") pod \"route-controller-manager-588697dd78-t4tn5\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.678017 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94a9fa12-c97d-4b13-81a1-da33f15c7f42-client-ca\") pod \"route-controller-manager-588697dd78-t4tn5\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.678746 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94a9fa12-c97d-4b13-81a1-da33f15c7f42-client-ca\") pod \"route-controller-manager-588697dd78-t4tn5\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.680450 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a9fa12-c97d-4b13-81a1-da33f15c7f42-config\") pod \"route-controller-manager-588697dd78-t4tn5\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.682909 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94a9fa12-c97d-4b13-81a1-da33f15c7f42-serving-cert\") pod \"route-controller-manager-588697dd78-t4tn5\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " 
pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.697123 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krwzg\" (UniqueName: \"kubernetes.io/projected/94a9fa12-c97d-4b13-81a1-da33f15c7f42-kube-api-access-krwzg\") pod \"route-controller-manager-588697dd78-t4tn5\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.868428 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:16 crc kubenswrapper[4837]: E0313 11:52:16.791273 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 13 11:52:16 crc kubenswrapper[4837]: E0313 11:52:16.791683 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xx4zq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-j246z_openshift-marketplace(32a36cbe-a17f-46bf-9c6a-1df6f427e2c6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 11:52:16 crc kubenswrapper[4837]: E0313 11:52:16.792880 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-j246z" podUID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" Mar 13 11:52:18 crc 
kubenswrapper[4837]: I0313 11:52:18.121653 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"] Mar 13 11:52:18 crc kubenswrapper[4837]: E0313 11:52:18.165044 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-j246z" podUID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" Mar 13 11:52:18 crc kubenswrapper[4837]: I0313 11:52:18.220107 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5"] Mar 13 11:52:18 crc kubenswrapper[4837]: E0313 11:52:18.222772 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 13 11:52:18 crc kubenswrapper[4837]: E0313 11:52:18.222933 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l42fc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-jspgm_openshift-marketplace(5236ae0e-b305-4f1c-9125-bbac1eeb07f3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 11:52:18 crc kubenswrapper[4837]: E0313 11:52:18.224201 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-jspgm" podUID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" Mar 13 11:52:19 crc 
kubenswrapper[4837]: I0313 11:52:19.006674 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 13 11:52:19 crc kubenswrapper[4837]: I0313 11:52:19.007584 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:52:19 crc kubenswrapper[4837]: I0313 11:52:19.018896 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 13 11:52:19 crc kubenswrapper[4837]: I0313 11:52:19.122038 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/def1c7aa-51a6-4ee0-93d5-714721e9fc27-kube-api-access\") pod \"installer-9-crc\" (UID: \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:52:19 crc kubenswrapper[4837]: I0313 11:52:19.122099 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/def1c7aa-51a6-4ee0-93d5-714721e9fc27-var-lock\") pod \"installer-9-crc\" (UID: \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:52:19 crc kubenswrapper[4837]: I0313 11:52:19.122205 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/def1c7aa-51a6-4ee0-93d5-714721e9fc27-kubelet-dir\") pod \"installer-9-crc\" (UID: \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:52:19 crc kubenswrapper[4837]: I0313 11:52:19.223066 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/def1c7aa-51a6-4ee0-93d5-714721e9fc27-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"def1c7aa-51a6-4ee0-93d5-714721e9fc27\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:52:19 crc kubenswrapper[4837]: I0313 11:52:19.223119 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/def1c7aa-51a6-4ee0-93d5-714721e9fc27-var-lock\") pod \"installer-9-crc\" (UID: \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:52:19 crc kubenswrapper[4837]: I0313 11:52:19.223149 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/def1c7aa-51a6-4ee0-93d5-714721e9fc27-kubelet-dir\") pod \"installer-9-crc\" (UID: \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:52:19 crc kubenswrapper[4837]: I0313 11:52:19.223285 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/def1c7aa-51a6-4ee0-93d5-714721e9fc27-var-lock\") pod \"installer-9-crc\" (UID: \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:52:19 crc kubenswrapper[4837]: I0313 11:52:19.223391 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/def1c7aa-51a6-4ee0-93d5-714721e9fc27-kubelet-dir\") pod \"installer-9-crc\" (UID: \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:52:19 crc kubenswrapper[4837]: I0313 11:52:19.241208 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/def1c7aa-51a6-4ee0-93d5-714721e9fc27-kube-api-access\") pod \"installer-9-crc\" (UID: \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:52:19 crc kubenswrapper[4837]: I0313 11:52:19.336261 4837 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:52:19 crc kubenswrapper[4837]: E0313 11:52:19.721777 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-jspgm" podUID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" Mar 13 11:52:19 crc kubenswrapper[4837]: E0313 11:52:19.818494 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 13 11:52:19 crc kubenswrapper[4837]: E0313 11:52:19.818670 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xmfv2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-vx4r8_openshift-marketplace(45e6ae52-59ef-446f-917a-549d34ffbf8e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 11:52:19 crc kubenswrapper[4837]: E0313 11:52:19.819751 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 13 11:52:19 crc kubenswrapper[4837]: E0313 11:52:19.819774 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-vx4r8" podUID="45e6ae52-59ef-446f-917a-549d34ffbf8e" Mar 13 11:52:19 crc kubenswrapper[4837]: E0313 11:52:19.819847 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bcgkl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-twtbj_openshift-marketplace(278c91cc-2624-42cd-a35e-287e22d22f7d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 11:52:19 crc kubenswrapper[4837]: E0313 11:52:19.820993 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-twtbj" podUID="278c91cc-2624-42cd-a35e-287e22d22f7d" Mar 13 11:52:19 crc kubenswrapper[4837]: E0313 11:52:19.854331 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 13 11:52:19 crc kubenswrapper[4837]: E0313 11:52:19.854470 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vpclz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-ng6kk_openshift-marketplace(bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 11:52:19 crc kubenswrapper[4837]: E0313 11:52:19.855625 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-ng6kk" podUID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" Mar 13 11:52:19 crc 
kubenswrapper[4837]: E0313 11:52:19.861778 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 13 11:52:19 crc kubenswrapper[4837]: E0313 11:52:19.861911 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wfddm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-7crb6_openshift-marketplace(080747b0-3d43-4ff1-b21c-b8ea9fc2f961): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 11:52:19 crc kubenswrapper[4837]: E0313 11:52:19.863096 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-7crb6" podUID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" Mar 13 11:52:21 crc kubenswrapper[4837]: E0313 11:52:21.331333 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-7crb6" podUID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" Mar 13 11:52:21 crc kubenswrapper[4837]: E0313 11:52:21.331440 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-ng6kk" podUID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" Mar 13 11:52:21 crc kubenswrapper[4837]: E0313 11:52:21.331446 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-vx4r8" podUID="45e6ae52-59ef-446f-917a-549d34ffbf8e" Mar 13 11:52:21 crc kubenswrapper[4837]: E0313 11:52:21.331511 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with 
ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-twtbj" podUID="278c91cc-2624-42cd-a35e-287e22d22f7d" Mar 13 11:52:21 crc kubenswrapper[4837]: I0313 11:52:21.372340 4837 scope.go:117] "RemoveContainer" containerID="b82fa6f2134589dee51636289f2f8c0ff8d4c77d04a184b0382c40aa9a2b8bdc" Mar 13 11:52:21 crc kubenswrapper[4837]: E0313 11:52:21.415213 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 13 11:52:21 crc kubenswrapper[4837]: E0313 11:52:21.415412 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gfvgs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-ft6cr_openshift-marketplace(e6060cf2-077e-4112-af57-f100e297f320): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 11:52:21 crc kubenswrapper[4837]: E0313 11:52:21.416561 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-ft6cr" podUID="e6060cf2-077e-4112-af57-f100e297f320" Mar 13 11:52:21 crc 
kubenswrapper[4837]: E0313 11:52:21.470214 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 13 11:52:21 crc kubenswrapper[4837]: E0313 11:52:21.470404 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4rpl9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-5tnrx_openshift-marketplace(6870caea-07d6-4465-86b1-645a2e29b240): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 11:52:21 crc kubenswrapper[4837]: E0313 11:52:21.471850 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-5tnrx" podUID="6870caea-07d6-4465-86b1-645a2e29b240" Mar 13 11:52:21 crc kubenswrapper[4837]: I0313 11:52:21.779863 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556712-g8877"] Mar 13 11:52:21 crc kubenswrapper[4837]: I0313 11:52:21.784405 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5"] Mar 13 11:52:21 crc kubenswrapper[4837]: W0313 11:52:21.790893 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87edec8a_33b2_44c0_bbcb_1e4f5dded1b2.slice/crio-a21cd2ecc8f429b49f1437604fc5e32c63e6746f08d9a4b8da352497b132669f WatchSource:0}: Error finding container a21cd2ecc8f429b49f1437604fc5e32c63e6746f08d9a4b8da352497b132669f: Status 404 returned error can't find the container with id a21cd2ecc8f429b49f1437604fc5e32c63e6746f08d9a4b8da352497b132669f Mar 13 11:52:21 crc kubenswrapper[4837]: I0313 11:52:21.844280 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"] Mar 13 11:52:21 crc kubenswrapper[4837]: W0313 11:52:21.850987 4837 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9af86f4e_8143_426d_98a6_b59bde2a6247.slice/crio-0a64770191264774b60b3b0035ecdb4182d299c43a1bebd3bdb1f2e9e4efd51c WatchSource:0}: Error finding container 0a64770191264774b60b3b0035ecdb4182d299c43a1bebd3bdb1f2e9e4efd51c: Status 404 returned error can't find the container with id 0a64770191264774b60b3b0035ecdb4182d299c43a1bebd3bdb1f2e9e4efd51c Mar 13 11:52:21 crc kubenswrapper[4837]: I0313 11:52:21.899555 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 13 11:52:21 crc kubenswrapper[4837]: I0313 11:52:21.910031 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 13 11:52:21 crc kubenswrapper[4837]: W0313 11:52:21.912416 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0db827be_f908_46a8_9402_a858214284e7.slice/crio-dbdd7156061fda7162f147ca816854574c46fd0ae13942bad0a58567040c30eb WatchSource:0}: Error finding container dbdd7156061fda7162f147ca816854574c46fd0ae13942bad0a58567040c30eb: Status 404 returned error can't find the container with id dbdd7156061fda7162f147ca816854574c46fd0ae13942bad0a58567040c30eb Mar 13 11:52:21 crc kubenswrapper[4837]: W0313 11:52:21.928889 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddef1c7aa_51a6_4ee0_93d5_714721e9fc27.slice/crio-971c2474cd87ef98e0d1f40d16055d082ae34c7f72c4c883a6feb86fd0ff4ce0 WatchSource:0}: Error finding container 971c2474cd87ef98e0d1f40d16055d082ae34c7f72c4c883a6feb86fd0ff4ce0: Status 404 returned error can't find the container with id 971c2474cd87ef98e0d1f40d16055d082ae34c7f72c4c883a6feb86fd0ff4ce0 Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.212531 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" 
event={"ID":"94a9fa12-c97d-4b13-81a1-da33f15c7f42","Type":"ContainerStarted","Data":"156ca548c0f1c4ee06367765d9ba3fe7ce821e85f789c679adc2538ac26cece9"} Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.212594 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" podUID="94a9fa12-c97d-4b13-81a1-da33f15c7f42" containerName="route-controller-manager" containerID="cri-o://156ca548c0f1c4ee06367765d9ba3fe7ce821e85f789c679adc2538ac26cece9" gracePeriod=30 Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.212663 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.212682 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" event={"ID":"94a9fa12-c97d-4b13-81a1-da33f15c7f42","Type":"ContainerStarted","Data":"e8304698d4bf5325cbeebf58e902f92321fa9a970fc375c2b81310795d7c51bb"} Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.216050 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0db827be-f908-46a8-9402-a858214284e7","Type":"ContainerStarted","Data":"1302ef186a9c607b00444c8a5a974e46bcf1b9f7828a0fc6da6c8957d4d4e5cd"} Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.216095 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0db827be-f908-46a8-9402-a858214284e7","Type":"ContainerStarted","Data":"dbdd7156061fda7162f147ca816854574c46fd0ae13942bad0a58567040c30eb"} Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.221031 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"def1c7aa-51a6-4ee0-93d5-714721e9fc27","Type":"ContainerStarted","Data":"65e28d9ae9393725ced85e7d2690513d16c23a3765c0987cd0c005a1bb7bef87"} Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.221085 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"def1c7aa-51a6-4ee0-93d5-714721e9fc27","Type":"ContainerStarted","Data":"971c2474cd87ef98e0d1f40d16055d082ae34c7f72c4c883a6feb86fd0ff4ce0"} Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.223413 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556712-g8877" event={"ID":"87edec8a-33b2-44c0-bbcb-1e4f5dded1b2","Type":"ContainerStarted","Data":"a21cd2ecc8f429b49f1437604fc5e32c63e6746f08d9a4b8da352497b132669f"} Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.226088 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" event={"ID":"9af86f4e-8143-426d-98a6-b59bde2a6247","Type":"ContainerStarted","Data":"e8a2cef8e75a4e0ace069da9cf579224b66ef8558043d4da9db0156d41a9a6b0"} Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.226157 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" event={"ID":"9af86f4e-8143-426d-98a6-b59bde2a6247","Type":"ContainerStarted","Data":"0a64770191264774b60b3b0035ecdb4182d299c43a1bebd3bdb1f2e9e4efd51c"} Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.226305 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" podUID="9af86f4e-8143-426d-98a6-b59bde2a6247" containerName="controller-manager" containerID="cri-o://e8a2cef8e75a4e0ace069da9cf579224b66ef8558043d4da9db0156d41a9a6b0" gracePeriod=30 Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.226801 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" Mar 13 11:52:22 crc kubenswrapper[4837]: E0313 11:52:22.227967 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-5tnrx" podUID="6870caea-07d6-4465-86b1-645a2e29b240" Mar 13 11:52:22 crc kubenswrapper[4837]: E0313 11:52:22.227989 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-ft6cr" podUID="e6060cf2-077e-4112-af57-f100e297f320" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.234506 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" podStartSLOduration=24.234487848 podStartE2EDuration="24.234487848s" podCreationTimestamp="2026-03-13 11:51:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:52:22.231468243 +0000 UTC m=+257.869735016" watchObservedRunningTime="2026-03-13 11:52:22.234487848 +0000 UTC m=+257.872754621" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.252187 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=4.252164652 podStartE2EDuration="4.252164652s" podCreationTimestamp="2026-03-13 11:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:52:22.245600806 +0000 UTC m=+257.883867569" watchObservedRunningTime="2026-03-13 
11:52:22.252164652 +0000 UTC m=+257.890431415" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.263562 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" podStartSLOduration=24.263547228 podStartE2EDuration="24.263547228s" podCreationTimestamp="2026-03-13 11:51:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:52:22.26330042 +0000 UTC m=+257.901567183" watchObservedRunningTime="2026-03-13 11:52:22.263547228 +0000 UTC m=+257.901813991" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.290303 4837 patch_prober.go:28] interesting pod/controller-manager-6dfc58dd94-n92qv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": EOF" start-of-body= Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.290365 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" podUID="9af86f4e-8143-426d-98a6-b59bde2a6247" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": EOF" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.312007 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=8.311987254 podStartE2EDuration="8.311987254s" podCreationTimestamp="2026-03-13 11:52:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:52:22.293470905 +0000 UTC m=+257.931737668" watchObservedRunningTime="2026-03-13 11:52:22.311987254 +0000 UTC m=+257.950254017" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.544861 4837 patch_prober.go:28] interesting 
pod/route-controller-manager-588697dd78-t4tn5 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": read tcp 10.217.0.2:59940->10.217.0.61:8443: read: connection reset by peer" start-of-body= Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.545212 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" podUID="94a9fa12-c97d-4b13-81a1-da33f15c7f42" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": read tcp 10.217.0.2:59940->10.217.0.61:8443: read: connection reset by peer" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.685410 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.714776 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s"] Mar 13 11:52:22 crc kubenswrapper[4837]: E0313 11:52:22.715016 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9af86f4e-8143-426d-98a6-b59bde2a6247" containerName="controller-manager" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.715031 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af86f4e-8143-426d-98a6-b59bde2a6247" containerName="controller-manager" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.715177 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="9af86f4e-8143-426d-98a6-b59bde2a6247" containerName="controller-manager" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.715608 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.724036 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s"] Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.847305 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-588697dd78-t4tn5_94a9fa12-c97d-4b13-81a1-da33f15c7f42/route-controller-manager/0.log" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.847571 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.866956 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4k7j\" (UniqueName: \"kubernetes.io/projected/9af86f4e-8143-426d-98a6-b59bde2a6247-kube-api-access-q4k7j\") pod \"9af86f4e-8143-426d-98a6-b59bde2a6247\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.867189 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-client-ca\") pod \"9af86f4e-8143-426d-98a6-b59bde2a6247\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.867231 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-config\") pod \"9af86f4e-8143-426d-98a6-b59bde2a6247\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.867316 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-proxy-ca-bundles\") pod \"9af86f4e-8143-426d-98a6-b59bde2a6247\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.867419 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9af86f4e-8143-426d-98a6-b59bde2a6247-serving-cert\") pod \"9af86f4e-8143-426d-98a6-b59bde2a6247\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.867766 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f85724b-0c9e-4a01-927a-0054866f46d5-serving-cert\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.867948 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-client-ca\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.868110 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jdgb\" (UniqueName: \"kubernetes.io/projected/2f85724b-0c9e-4a01-927a-0054866f46d5-kube-api-access-6jdgb\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.868172 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-config\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.868210 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-proxy-ca-bundles\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.868501 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-client-ca" (OuterVolumeSpecName: "client-ca") pod "9af86f4e-8143-426d-98a6-b59bde2a6247" (UID: "9af86f4e-8143-426d-98a6-b59bde2a6247"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.868561 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-config" (OuterVolumeSpecName: "config") pod "9af86f4e-8143-426d-98a6-b59bde2a6247" (UID: "9af86f4e-8143-426d-98a6-b59bde2a6247"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.868865 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9af86f4e-8143-426d-98a6-b59bde2a6247" (UID: "9af86f4e-8143-426d-98a6-b59bde2a6247"). 
InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.874121 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9af86f4e-8143-426d-98a6-b59bde2a6247-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9af86f4e-8143-426d-98a6-b59bde2a6247" (UID: "9af86f4e-8143-426d-98a6-b59bde2a6247"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.874330 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9af86f4e-8143-426d-98a6-b59bde2a6247-kube-api-access-q4k7j" (OuterVolumeSpecName: "kube-api-access-q4k7j") pod "9af86f4e-8143-426d-98a6-b59bde2a6247" (UID: "9af86f4e-8143-426d-98a6-b59bde2a6247"). InnerVolumeSpecName "kube-api-access-q4k7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.970401 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94a9fa12-c97d-4b13-81a1-da33f15c7f42-client-ca\") pod \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.970464 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a9fa12-c97d-4b13-81a1-da33f15c7f42-config\") pod \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.970527 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krwzg\" (UniqueName: \"kubernetes.io/projected/94a9fa12-c97d-4b13-81a1-da33f15c7f42-kube-api-access-krwzg\") pod \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\" (UID: 
\"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.970568 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94a9fa12-c97d-4b13-81a1-da33f15c7f42-serving-cert\") pod \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.970748 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jdgb\" (UniqueName: \"kubernetes.io/projected/2f85724b-0c9e-4a01-927a-0054866f46d5-kube-api-access-6jdgb\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.970791 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-config\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.970815 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-proxy-ca-bundles\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.970854 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f85724b-0c9e-4a01-927a-0054866f46d5-serving-cert\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: 
\"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.970896 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-client-ca\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.970964 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.970979 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9af86f4e-8143-426d-98a6-b59bde2a6247-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.970992 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4k7j\" (UniqueName: \"kubernetes.io/projected/9af86f4e-8143-426d-98a6-b59bde2a6247-kube-api-access-q4k7j\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.971019 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.971030 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.971465 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/94a9fa12-c97d-4b13-81a1-da33f15c7f42-client-ca" (OuterVolumeSpecName: "client-ca") pod "94a9fa12-c97d-4b13-81a1-da33f15c7f42" (UID: "94a9fa12-c97d-4b13-81a1-da33f15c7f42"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.971701 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94a9fa12-c97d-4b13-81a1-da33f15c7f42-config" (OuterVolumeSpecName: "config") pod "94a9fa12-c97d-4b13-81a1-da33f15c7f42" (UID: "94a9fa12-c97d-4b13-81a1-da33f15c7f42"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.972257 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-proxy-ca-bundles\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.972479 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-config\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.973279 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-client-ca\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.976326 4837 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94a9fa12-c97d-4b13-81a1-da33f15c7f42-kube-api-access-krwzg" (OuterVolumeSpecName: "kube-api-access-krwzg") pod "94a9fa12-c97d-4b13-81a1-da33f15c7f42" (UID: "94a9fa12-c97d-4b13-81a1-da33f15c7f42"). InnerVolumeSpecName "kube-api-access-krwzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.976458 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94a9fa12-c97d-4b13-81a1-da33f15c7f42-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "94a9fa12-c97d-4b13-81a1-da33f15c7f42" (UID: "94a9fa12-c97d-4b13-81a1-da33f15c7f42"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.977002 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f85724b-0c9e-4a01-927a-0054866f46d5-serving-cert\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.989606 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jdgb\" (UniqueName: \"kubernetes.io/projected/2f85724b-0c9e-4a01-927a-0054866f46d5-kube-api-access-6jdgb\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.072863 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krwzg\" (UniqueName: \"kubernetes.io/projected/94a9fa12-c97d-4b13-81a1-da33f15c7f42-kube-api-access-krwzg\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 
11:52:23.073899 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94a9fa12-c97d-4b13-81a1-da33f15c7f42-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.073923 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94a9fa12-c97d-4b13-81a1-da33f15c7f42-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.073938 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a9fa12-c97d-4b13-81a1-da33f15c7f42-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.104999 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.158795 4837 csr.go:261] certificate signing request csr-qq6w7 is approved, waiting to be issued Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.163933 4837 csr.go:257] certificate signing request csr-qq6w7 is issued Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.237365 4837 generic.go:334] "Generic (PLEG): container finished" podID="87edec8a-33b2-44c0-bbcb-1e4f5dded1b2" containerID="8c4d75bce91d26c5c90ccce3126b557507017a92b0dd1db884cee46957fc8b2f" exitCode=0 Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.237423 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556712-g8877" event={"ID":"87edec8a-33b2-44c0-bbcb-1e4f5dded1b2","Type":"ContainerDied","Data":"8c4d75bce91d26c5c90ccce3126b557507017a92b0dd1db884cee46957fc8b2f"} Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.239310 4837 generic.go:334] "Generic (PLEG): container finished" podID="9af86f4e-8143-426d-98a6-b59bde2a6247" 
containerID="e8a2cef8e75a4e0ace069da9cf579224b66ef8558043d4da9db0156d41a9a6b0" exitCode=0 Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.239361 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" event={"ID":"9af86f4e-8143-426d-98a6-b59bde2a6247","Type":"ContainerDied","Data":"e8a2cef8e75a4e0ace069da9cf579224b66ef8558043d4da9db0156d41a9a6b0"} Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.239394 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.239432 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" event={"ID":"9af86f4e-8143-426d-98a6-b59bde2a6247","Type":"ContainerDied","Data":"0a64770191264774b60b3b0035ecdb4182d299c43a1bebd3bdb1f2e9e4efd51c"} Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.239453 4837 scope.go:117] "RemoveContainer" containerID="e8a2cef8e75a4e0ace069da9cf579224b66ef8558043d4da9db0156d41a9a6b0" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.242315 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-588697dd78-t4tn5_94a9fa12-c97d-4b13-81a1-da33f15c7f42/route-controller-manager/0.log" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.242349 4837 generic.go:334] "Generic (PLEG): container finished" podID="94a9fa12-c97d-4b13-81a1-da33f15c7f42" containerID="156ca548c0f1c4ee06367765d9ba3fe7ce821e85f789c679adc2538ac26cece9" exitCode=255 Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.242389 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" 
event={"ID":"94a9fa12-c97d-4b13-81a1-da33f15c7f42","Type":"ContainerDied","Data":"156ca548c0f1c4ee06367765d9ba3fe7ce821e85f789c679adc2538ac26cece9"} Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.242416 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" event={"ID":"94a9fa12-c97d-4b13-81a1-da33f15c7f42","Type":"ContainerDied","Data":"e8304698d4bf5325cbeebf58e902f92321fa9a970fc375c2b81310795d7c51bb"} Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.242467 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.245802 4837 generic.go:334] "Generic (PLEG): container finished" podID="0db827be-f908-46a8-9402-a858214284e7" containerID="1302ef186a9c607b00444c8a5a974e46bcf1b9f7828a0fc6da6c8957d4d4e5cd" exitCode=0 Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.245870 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0db827be-f908-46a8-9402-a858214284e7","Type":"ContainerDied","Data":"1302ef186a9c607b00444c8a5a974e46bcf1b9f7828a0fc6da6c8957d4d4e5cd"} Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.267995 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"] Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.270615 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"] Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.274038 4837 scope.go:117] "RemoveContainer" containerID="e8a2cef8e75a4e0ace069da9cf579224b66ef8558043d4da9db0156d41a9a6b0" Mar 13 11:52:23 crc kubenswrapper[4837]: E0313 11:52:23.276115 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"e8a2cef8e75a4e0ace069da9cf579224b66ef8558043d4da9db0156d41a9a6b0\": container with ID starting with e8a2cef8e75a4e0ace069da9cf579224b66ef8558043d4da9db0156d41a9a6b0 not found: ID does not exist" containerID="e8a2cef8e75a4e0ace069da9cf579224b66ef8558043d4da9db0156d41a9a6b0" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.276172 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8a2cef8e75a4e0ace069da9cf579224b66ef8558043d4da9db0156d41a9a6b0"} err="failed to get container status \"e8a2cef8e75a4e0ace069da9cf579224b66ef8558043d4da9db0156d41a9a6b0\": rpc error: code = NotFound desc = could not find container \"e8a2cef8e75a4e0ace069da9cf579224b66ef8558043d4da9db0156d41a9a6b0\": container with ID starting with e8a2cef8e75a4e0ace069da9cf579224b66ef8558043d4da9db0156d41a9a6b0 not found: ID does not exist" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.276198 4837 scope.go:117] "RemoveContainer" containerID="156ca548c0f1c4ee06367765d9ba3fe7ce821e85f789c679adc2538ac26cece9" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.293231 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s"] Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.298988 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5"] Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.301427 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5"] Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.303827 4837 scope.go:117] "RemoveContainer" containerID="156ca548c0f1c4ee06367765d9ba3fe7ce821e85f789c679adc2538ac26cece9" Mar 13 11:52:23 crc kubenswrapper[4837]: E0313 11:52:23.304798 4837 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"156ca548c0f1c4ee06367765d9ba3fe7ce821e85f789c679adc2538ac26cece9\": container with ID starting with 156ca548c0f1c4ee06367765d9ba3fe7ce821e85f789c679adc2538ac26cece9 not found: ID does not exist" containerID="156ca548c0f1c4ee06367765d9ba3fe7ce821e85f789c679adc2538ac26cece9" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.304828 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"156ca548c0f1c4ee06367765d9ba3fe7ce821e85f789c679adc2538ac26cece9"} err="failed to get container status \"156ca548c0f1c4ee06367765d9ba3fe7ce821e85f789c679adc2538ac26cece9\": rpc error: code = NotFound desc = could not find container \"156ca548c0f1c4ee06367765d9ba3fe7ce821e85f789c679adc2538ac26cece9\": container with ID starting with 156ca548c0f1c4ee06367765d9ba3fe7ce821e85f789c679adc2538ac26cece9 not found: ID does not exist" Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.166473 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-09 20:52:31.936490827 +0000 UTC Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.166513 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6513h0m7.769979994s for next certificate rotation Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.253505 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" event={"ID":"2f85724b-0c9e-4a01-927a-0054866f46d5","Type":"ContainerStarted","Data":"a26cbf0bbee08d073239d4a9f9f827d33e581ba8c5b8a64d42ab65f1206a8910"} Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.253550 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" 
event={"ID":"2f85724b-0c9e-4a01-927a-0054866f46d5","Type":"ContainerStarted","Data":"a1d4f83811bf42ca7944c4333709cb3c7a2ea535fb4a2466c48aaccfc4a847ba"} Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.253998 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.267887 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.280350 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" podStartSLOduration=6.280327626 podStartE2EDuration="6.280327626s" podCreationTimestamp="2026-03-13 11:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:52:24.274548805 +0000 UTC m=+259.912815578" watchObservedRunningTime="2026-03-13 11:52:24.280327626 +0000 UTC m=+259.918594389" Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.510264 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556712-g8877" Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.578494 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.593977 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0db827be-f908-46a8-9402-a858214284e7-kubelet-dir\") pod \"0db827be-f908-46a8-9402-a858214284e7\" (UID: \"0db827be-f908-46a8-9402-a858214284e7\") " Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.594073 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0db827be-f908-46a8-9402-a858214284e7-kube-api-access\") pod \"0db827be-f908-46a8-9402-a858214284e7\" (UID: \"0db827be-f908-46a8-9402-a858214284e7\") " Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.594082 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0db827be-f908-46a8-9402-a858214284e7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0db827be-f908-46a8-9402-a858214284e7" (UID: "0db827be-f908-46a8-9402-a858214284e7"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.594126 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xqlb\" (UniqueName: \"kubernetes.io/projected/87edec8a-33b2-44c0-bbcb-1e4f5dded1b2-kube-api-access-9xqlb\") pod \"87edec8a-33b2-44c0-bbcb-1e4f5dded1b2\" (UID: \"87edec8a-33b2-44c0-bbcb-1e4f5dded1b2\") " Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.594305 4837 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0db827be-f908-46a8-9402-a858214284e7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.642147 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0db827be-f908-46a8-9402-a858214284e7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0db827be-f908-46a8-9402-a858214284e7" (UID: "0db827be-f908-46a8-9402-a858214284e7"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.642327 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87edec8a-33b2-44c0-bbcb-1e4f5dded1b2-kube-api-access-9xqlb" (OuterVolumeSpecName: "kube-api-access-9xqlb") pod "87edec8a-33b2-44c0-bbcb-1e4f5dded1b2" (UID: "87edec8a-33b2-44c0-bbcb-1e4f5dded1b2"). InnerVolumeSpecName "kube-api-access-9xqlb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.695051 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0db827be-f908-46a8-9402-a858214284e7-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.695095 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xqlb\" (UniqueName: \"kubernetes.io/projected/87edec8a-33b2-44c0-bbcb-1e4f5dded1b2-kube-api-access-9xqlb\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.057993 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94a9fa12-c97d-4b13-81a1-da33f15c7f42" path="/var/lib/kubelet/pods/94a9fa12-c97d-4b13-81a1-da33f15c7f42/volumes" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.058842 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9af86f4e-8143-426d-98a6-b59bde2a6247" path="/var/lib/kubelet/pods/9af86f4e-8143-426d-98a6-b59bde2a6247/volumes" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.166777 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-16 19:25:23.776019141 +0000 UTC Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.166835 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 5959h32m58.609186902s for next certificate rotation Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.273456 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0db827be-f908-46a8-9402-a858214284e7","Type":"ContainerDied","Data":"dbdd7156061fda7162f147ca816854574c46fd0ae13942bad0a58567040c30eb"} Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.273501 4837 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="dbdd7156061fda7162f147ca816854574c46fd0ae13942bad0a58567040c30eb" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.273567 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.275331 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556712-g8877" event={"ID":"87edec8a-33b2-44c0-bbcb-1e4f5dded1b2","Type":"ContainerDied","Data":"a21cd2ecc8f429b49f1437604fc5e32c63e6746f08d9a4b8da352497b132669f"} Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.275387 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a21cd2ecc8f429b49f1437604fc5e32c63e6746f08d9a4b8da352497b132669f" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.275352 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556712-g8877" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.530974 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j"] Mar 13 11:52:25 crc kubenswrapper[4837]: E0313 11:52:25.531256 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87edec8a-33b2-44c0-bbcb-1e4f5dded1b2" containerName="oc" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.531278 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="87edec8a-33b2-44c0-bbcb-1e4f5dded1b2" containerName="oc" Mar 13 11:52:25 crc kubenswrapper[4837]: E0313 11:52:25.531295 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a9fa12-c97d-4b13-81a1-da33f15c7f42" containerName="route-controller-manager" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.531303 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a9fa12-c97d-4b13-81a1-da33f15c7f42" 
containerName="route-controller-manager" Mar 13 11:52:25 crc kubenswrapper[4837]: E0313 11:52:25.531323 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0db827be-f908-46a8-9402-a858214284e7" containerName="pruner" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.531331 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0db827be-f908-46a8-9402-a858214284e7" containerName="pruner" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.531445 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="94a9fa12-c97d-4b13-81a1-da33f15c7f42" containerName="route-controller-manager" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.531462 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0db827be-f908-46a8-9402-a858214284e7" containerName="pruner" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.531478 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="87edec8a-33b2-44c0-bbcb-1e4f5dded1b2" containerName="oc" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.531905 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.533585 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.534060 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.534407 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.534582 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.534756 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.534946 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.558877 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j"] Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.710008 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-serving-cert\") pod \"route-controller-manager-cc969d968-65b4j\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.710077 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-config\") pod \"route-controller-manager-cc969d968-65b4j\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.710104 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqdrx\" (UniqueName: \"kubernetes.io/projected/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-kube-api-access-fqdrx\") pod \"route-controller-manager-cc969d968-65b4j\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.710127 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-client-ca\") pod \"route-controller-manager-cc969d968-65b4j\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.812333 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-serving-cert\") pod \"route-controller-manager-cc969d968-65b4j\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.812439 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-config\") pod \"route-controller-manager-cc969d968-65b4j\" (UID: 
\"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.812481 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqdrx\" (UniqueName: \"kubernetes.io/projected/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-kube-api-access-fqdrx\") pod \"route-controller-manager-cc969d968-65b4j\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.812515 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-client-ca\") pod \"route-controller-manager-cc969d968-65b4j\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.814197 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-client-ca\") pod \"route-controller-manager-cc969d968-65b4j\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.814327 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-config\") pod \"route-controller-manager-cc969d968-65b4j\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.816952 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-serving-cert\") pod \"route-controller-manager-cc969d968-65b4j\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.831015 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqdrx\" (UniqueName: \"kubernetes.io/projected/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-kube-api-access-fqdrx\") pod \"route-controller-manager-cc969d968-65b4j\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.860214 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:26 crc kubenswrapper[4837]: I0313 11:52:26.248339 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j"] Mar 13 11:52:26 crc kubenswrapper[4837]: W0313 11:52:26.258080 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c0f7bce_90d1_4fe8_a832_8c4f55efd886.slice/crio-9be860a81b1b95a82b5cdc8cfccb2c34c9083d28d8fc3fa8cb17545d34f6a331 WatchSource:0}: Error finding container 9be860a81b1b95a82b5cdc8cfccb2c34c9083d28d8fc3fa8cb17545d34f6a331: Status 404 returned error can't find the container with id 9be860a81b1b95a82b5cdc8cfccb2c34c9083d28d8fc3fa8cb17545d34f6a331 Mar 13 11:52:26 crc kubenswrapper[4837]: I0313 11:52:26.282113 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" 
event={"ID":"1c0f7bce-90d1-4fe8-a832-8c4f55efd886","Type":"ContainerStarted","Data":"9be860a81b1b95a82b5cdc8cfccb2c34c9083d28d8fc3fa8cb17545d34f6a331"} Mar 13 11:52:27 crc kubenswrapper[4837]: I0313 11:52:27.288983 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" event={"ID":"1c0f7bce-90d1-4fe8-a832-8c4f55efd886","Type":"ContainerStarted","Data":"c5b04d576ec724fb239a6b87808e44a211c810b2a1f9473b6bf7e374d0f380ef"} Mar 13 11:52:27 crc kubenswrapper[4837]: I0313 11:52:27.289243 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:27 crc kubenswrapper[4837]: I0313 11:52:27.295577 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:27 crc kubenswrapper[4837]: I0313 11:52:27.306171 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" podStartSLOduration=9.306155286 podStartE2EDuration="9.306155286s" podCreationTimestamp="2026-03-13 11:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:52:27.304040026 +0000 UTC m=+262.942306789" watchObservedRunningTime="2026-03-13 11:52:27.306155286 +0000 UTC m=+262.944422049" Mar 13 11:52:30 crc kubenswrapper[4837]: I0313 11:52:30.308386 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556710-lcprh" event={"ID":"0484d991-f239-47a2-80ff-0237945c27ac","Type":"ContainerStarted","Data":"b2ba4ee22041e914a3b1573c300fce67d3ac337c4d9b3d85c86421a82bc9711f"} Mar 13 11:52:30 crc kubenswrapper[4837]: I0313 11:52:30.323381 4837 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-infra/auto-csr-approver-29556710-lcprh" podStartSLOduration=94.271420207 podStartE2EDuration="2m30.323357475s" podCreationTimestamp="2026-03-13 11:50:00 +0000 UTC" firstStartedPulling="2026-03-13 11:51:33.87443962 +0000 UTC m=+209.512706383" lastFinishedPulling="2026-03-13 11:52:29.926376888 +0000 UTC m=+265.564643651" observedRunningTime="2026-03-13 11:52:30.320144101 +0000 UTC m=+265.958410864" watchObservedRunningTime="2026-03-13 11:52:30.323357475 +0000 UTC m=+265.961624238" Mar 13 11:52:31 crc kubenswrapper[4837]: I0313 11:52:31.317145 4837 generic.go:334] "Generic (PLEG): container finished" podID="0484d991-f239-47a2-80ff-0237945c27ac" containerID="b2ba4ee22041e914a3b1573c300fce67d3ac337c4d9b3d85c86421a82bc9711f" exitCode=0 Mar 13 11:52:31 crc kubenswrapper[4837]: I0313 11:52:31.317239 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556710-lcprh" event={"ID":"0484d991-f239-47a2-80ff-0237945c27ac","Type":"ContainerDied","Data":"b2ba4ee22041e914a3b1573c300fce67d3ac337c4d9b3d85c86421a82bc9711f"} Mar 13 11:52:31 crc kubenswrapper[4837]: I0313 11:52:31.319271 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j246z" event={"ID":"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6","Type":"ContainerStarted","Data":"73bd112e9625df12ee2d18dc3c732843142d27d7ee69bd7e78c3b55fe032dc84"} Mar 13 11:52:32 crc kubenswrapper[4837]: I0313 11:52:32.327030 4837 generic.go:334] "Generic (PLEG): container finished" podID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" containerID="73bd112e9625df12ee2d18dc3c732843142d27d7ee69bd7e78c3b55fe032dc84" exitCode=0 Mar 13 11:52:32 crc kubenswrapper[4837]: I0313 11:52:32.327125 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j246z" event={"ID":"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6","Type":"ContainerDied","Data":"73bd112e9625df12ee2d18dc3c732843142d27d7ee69bd7e78c3b55fe032dc84"} Mar 13 11:52:32 
crc kubenswrapper[4837]: I0313 11:52:32.722863 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556710-lcprh" Mar 13 11:52:32 crc kubenswrapper[4837]: I0313 11:52:32.829156 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlwdw\" (UniqueName: \"kubernetes.io/projected/0484d991-f239-47a2-80ff-0237945c27ac-kube-api-access-dlwdw\") pod \"0484d991-f239-47a2-80ff-0237945c27ac\" (UID: \"0484d991-f239-47a2-80ff-0237945c27ac\") " Mar 13 11:52:32 crc kubenswrapper[4837]: I0313 11:52:32.834624 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0484d991-f239-47a2-80ff-0237945c27ac-kube-api-access-dlwdw" (OuterVolumeSpecName: "kube-api-access-dlwdw") pod "0484d991-f239-47a2-80ff-0237945c27ac" (UID: "0484d991-f239-47a2-80ff-0237945c27ac"). InnerVolumeSpecName "kube-api-access-dlwdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:32 crc kubenswrapper[4837]: I0313 11:52:32.930055 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlwdw\" (UniqueName: \"kubernetes.io/projected/0484d991-f239-47a2-80ff-0237945c27ac-kube-api-access-dlwdw\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:33 crc kubenswrapper[4837]: I0313 11:52:33.337022 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556710-lcprh" event={"ID":"0484d991-f239-47a2-80ff-0237945c27ac","Type":"ContainerDied","Data":"960f7af1fa61c8ed012820a8878b593f9924c583dd0d3076ea82e4ba9452a14b"} Mar 13 11:52:33 crc kubenswrapper[4837]: I0313 11:52:33.337105 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="960f7af1fa61c8ed012820a8878b593f9924c583dd0d3076ea82e4ba9452a14b" Mar 13 11:52:33 crc kubenswrapper[4837]: I0313 11:52:33.337124 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556710-lcprh" Mar 13 11:52:35 crc kubenswrapper[4837]: I0313 11:52:35.483937 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 11:52:35 crc kubenswrapper[4837]: I0313 11:52:35.484411 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 11:52:35 crc kubenswrapper[4837]: I0313 11:52:35.484912 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 11:52:35 crc kubenswrapper[4837]: I0313 11:52:35.486086 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 11:52:35 crc kubenswrapper[4837]: I0313 11:52:35.486175 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a" gracePeriod=600 Mar 13 11:52:36 crc kubenswrapper[4837]: I0313 11:52:36.361355 4837 generic.go:334] "Generic (PLEG): container finished" 
podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a" exitCode=0 Mar 13 11:52:36 crc kubenswrapper[4837]: I0313 11:52:36.361389 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a"} Mar 13 11:52:38 crc kubenswrapper[4837]: I0313 11:52:38.114932 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s"] Mar 13 11:52:38 crc kubenswrapper[4837]: I0313 11:52:38.116602 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" podUID="2f85724b-0c9e-4a01-927a-0054866f46d5" containerName="controller-manager" containerID="cri-o://a26cbf0bbee08d073239d4a9f9f827d33e581ba8c5b8a64d42ab65f1206a8910" gracePeriod=30 Mar 13 11:52:38 crc kubenswrapper[4837]: I0313 11:52:38.126247 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j"] Mar 13 11:52:38 crc kubenswrapper[4837]: I0313 11:52:38.126494 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" podUID="1c0f7bce-90d1-4fe8-a832-8c4f55efd886" containerName="route-controller-manager" containerID="cri-o://c5b04d576ec724fb239a6b87808e44a211c810b2a1f9473b6bf7e374d0f380ef" gracePeriod=30 Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.377448 4837 generic.go:334] "Generic (PLEG): container finished" podID="1c0f7bce-90d1-4fe8-a832-8c4f55efd886" containerID="c5b04d576ec724fb239a6b87808e44a211c810b2a1f9473b6bf7e374d0f380ef" exitCode=0 Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.377500 4837 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" event={"ID":"1c0f7bce-90d1-4fe8-a832-8c4f55efd886","Type":"ContainerDied","Data":"c5b04d576ec724fb239a6b87808e44a211c810b2a1f9473b6bf7e374d0f380ef"} Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.378887 4837 generic.go:334] "Generic (PLEG): container finished" podID="2f85724b-0c9e-4a01-927a-0054866f46d5" containerID="a26cbf0bbee08d073239d4a9f9f827d33e581ba8c5b8a64d42ab65f1206a8910" exitCode=0 Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.378907 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" event={"ID":"2f85724b-0c9e-4a01-927a-0054866f46d5","Type":"ContainerDied","Data":"a26cbf0bbee08d073239d4a9f9f827d33e581ba8c5b8a64d42ab65f1206a8910"} Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.740762 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.795171 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7bb68b4479-msngs"] Mar 13 11:52:39 crc kubenswrapper[4837]: E0313 11:52:39.795455 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f85724b-0c9e-4a01-927a-0054866f46d5" containerName="controller-manager" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.795474 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f85724b-0c9e-4a01-927a-0054866f46d5" containerName="controller-manager" Mar 13 11:52:39 crc kubenswrapper[4837]: E0313 11:52:39.795488 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0484d991-f239-47a2-80ff-0237945c27ac" containerName="oc" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.795502 4837 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0484d991-f239-47a2-80ff-0237945c27ac" containerName="oc" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.795677 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0484d991-f239-47a2-80ff-0237945c27ac" containerName="oc" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.795700 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f85724b-0c9e-4a01-927a-0054866f46d5" containerName="controller-manager" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.796132 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.803140 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bb68b4479-msngs"] Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.817855 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-proxy-ca-bundles\") pod \"2f85724b-0c9e-4a01-927a-0054866f46d5\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.817937 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-config\") pod \"2f85724b-0c9e-4a01-927a-0054866f46d5\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.819046 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2f85724b-0c9e-4a01-927a-0054866f46d5" (UID: "2f85724b-0c9e-4a01-927a-0054866f46d5"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.819126 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f85724b-0c9e-4a01-927a-0054866f46d5-serving-cert\") pod \"2f85724b-0c9e-4a01-927a-0054866f46d5\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.819260 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-client-ca\") pod \"2f85724b-0c9e-4a01-927a-0054866f46d5\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.819345 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jdgb\" (UniqueName: \"kubernetes.io/projected/2f85724b-0c9e-4a01-927a-0054866f46d5-kube-api-access-6jdgb\") pod \"2f85724b-0c9e-4a01-927a-0054866f46d5\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.819766 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-client-ca\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.819867 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kndkg\" (UniqueName: \"kubernetes.io/projected/5b640fc3-2425-48b0-adfa-3300a6d52002-kube-api-access-kndkg\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " 
pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.819872 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-config" (OuterVolumeSpecName: "config") pod "2f85724b-0c9e-4a01-927a-0054866f46d5" (UID: "2f85724b-0c9e-4a01-927a-0054866f46d5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.819999 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-proxy-ca-bundles\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.820041 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-config\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.820099 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b640fc3-2425-48b0-adfa-3300a6d52002-serving-cert\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.820102 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-client-ca" (OuterVolumeSpecName: "client-ca") pod "2f85724b-0c9e-4a01-927a-0054866f46d5" (UID: "2f85724b-0c9e-4a01-927a-0054866f46d5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.820195 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.820264 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.834806 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f85724b-0c9e-4a01-927a-0054866f46d5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2f85724b-0c9e-4a01-927a-0054866f46d5" (UID: "2f85724b-0c9e-4a01-927a-0054866f46d5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.834829 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f85724b-0c9e-4a01-927a-0054866f46d5-kube-api-access-6jdgb" (OuterVolumeSpecName: "kube-api-access-6jdgb") pod "2f85724b-0c9e-4a01-927a-0054866f46d5" (UID: "2f85724b-0c9e-4a01-927a-0054866f46d5"). InnerVolumeSpecName "kube-api-access-6jdgb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.921328 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kndkg\" (UniqueName: \"kubernetes.io/projected/5b640fc3-2425-48b0-adfa-3300a6d52002-kube-api-access-kndkg\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.921423 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-proxy-ca-bundles\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.921454 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-config\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.921496 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b640fc3-2425-48b0-adfa-3300a6d52002-serving-cert\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.921520 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-client-ca\") pod 
\"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.921577 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.921590 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jdgb\" (UniqueName: \"kubernetes.io/projected/2f85724b-0c9e-4a01-927a-0054866f46d5-kube-api-access-6jdgb\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.921602 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f85724b-0c9e-4a01-927a-0054866f46d5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.922655 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-client-ca\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.923772 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-config\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.924096 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-proxy-ca-bundles\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.927143 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b640fc3-2425-48b0-adfa-3300a6d52002-serving-cert\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.941885 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kndkg\" (UniqueName: \"kubernetes.io/projected/5b640fc3-2425-48b0-adfa-3300a6d52002-kube-api-access-kndkg\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.113372 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.194999 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.225121 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-client-ca\") pod \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.225189 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-serving-cert\") pod \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.225242 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-config\") pod \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.225271 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqdrx\" (UniqueName: \"kubernetes.io/projected/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-kube-api-access-fqdrx\") pod \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.226417 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-client-ca" (OuterVolumeSpecName: "client-ca") pod "1c0f7bce-90d1-4fe8-a832-8c4f55efd886" (UID: "1c0f7bce-90d1-4fe8-a832-8c4f55efd886"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.226447 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-config" (OuterVolumeSpecName: "config") pod "1c0f7bce-90d1-4fe8-a832-8c4f55efd886" (UID: "1c0f7bce-90d1-4fe8-a832-8c4f55efd886"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.228607 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-kube-api-access-fqdrx" (OuterVolumeSpecName: "kube-api-access-fqdrx") pod "1c0f7bce-90d1-4fe8-a832-8c4f55efd886" (UID: "1c0f7bce-90d1-4fe8-a832-8c4f55efd886"). InnerVolumeSpecName "kube-api-access-fqdrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.228717 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1c0f7bce-90d1-4fe8-a832-8c4f55efd886" (UID: "1c0f7bce-90d1-4fe8-a832-8c4f55efd886"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.327606 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.327744 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.327765 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqdrx\" (UniqueName: \"kubernetes.io/projected/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-kube-api-access-fqdrx\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.327788 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.384835 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" event={"ID":"2f85724b-0c9e-4a01-927a-0054866f46d5","Type":"ContainerDied","Data":"a1d4f83811bf42ca7944c4333709cb3c7a2ea535fb4a2466c48aaccfc4a847ba"} Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.384860 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.384891 4837 scope.go:117] "RemoveContainer" containerID="a26cbf0bbee08d073239d4a9f9f827d33e581ba8c5b8a64d42ab65f1206a8910" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.387056 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" event={"ID":"1c0f7bce-90d1-4fe8-a832-8c4f55efd886","Type":"ContainerDied","Data":"9be860a81b1b95a82b5cdc8cfccb2c34c9083d28d8fc3fa8cb17545d34f6a331"} Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.387153 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.417206 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s"] Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.420242 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s"] Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.428804 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j"] Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.431240 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j"] Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.713997 4837 scope.go:117] "RemoveContainer" containerID="c5b04d576ec724fb239a6b87808e44a211c810b2a1f9473b6bf7e374d0f380ef" Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.055826 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c0f7bce-90d1-4fe8-a832-8c4f55efd886" 
path="/var/lib/kubelet/pods/1c0f7bce-90d1-4fe8-a832-8c4f55efd886/volumes" Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.060197 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f85724b-0c9e-4a01-927a-0054866f46d5" path="/var/lib/kubelet/pods/2f85724b-0c9e-4a01-927a-0054866f46d5/volumes" Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.272111 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bb68b4479-msngs"] Mar 13 11:52:41 crc kubenswrapper[4837]: W0313 11:52:41.312736 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b640fc3_2425_48b0_adfa_3300a6d52002.slice/crio-097c9cec1e11f8aa83b30bf8b029f844e3b8d7cc463f16a2a3f8fa54330c0b4a WatchSource:0}: Error finding container 097c9cec1e11f8aa83b30bf8b029f844e3b8d7cc463f16a2a3f8fa54330c0b4a: Status 404 returned error can't find the container with id 097c9cec1e11f8aa83b30bf8b029f844e3b8d7cc463f16a2a3f8fa54330c0b4a Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.414204 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7crb6" event={"ID":"080747b0-3d43-4ff1-b21c-b8ea9fc2f961","Type":"ContainerStarted","Data":"0d9d2068fa9a75fcedf62135c026dbc7b6be8fecbfe8ba1e1bd893e7874fe650"} Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.422005 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft6cr" event={"ID":"e6060cf2-077e-4112-af57-f100e297f320","Type":"ContainerStarted","Data":"18157b9c0c2686c96644926d7ef6b55f8879075870dcef9cd6f8b2f09be008ae"} Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.430540 4837 generic.go:334] "Generic (PLEG): container finished" podID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" containerID="2b9a329e446509859ed3d2441ed61db997519085747ff5a45b3c942e5be127c3" exitCode=0 Mar 13 11:52:41 crc kubenswrapper[4837]: 
I0313 11:52:41.430688 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jspgm" event={"ID":"5236ae0e-b305-4f1c-9125-bbac1eeb07f3","Type":"ContainerDied","Data":"2b9a329e446509859ed3d2441ed61db997519085747ff5a45b3c942e5be127c3"} Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.446123 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j246z" event={"ID":"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6","Type":"ContainerStarted","Data":"14abcb9ea1fbbb60b399b29e871faa61fa90553e6b8ac0c1201e048e766e55b2"} Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.448533 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng6kk" event={"ID":"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d","Type":"ContainerStarted","Data":"f0462ba3a7de7e503c15d1b89ef700574f3f475b90e578e1c36542ba0d37ed51"} Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.458698 4837 generic.go:334] "Generic (PLEG): container finished" podID="45e6ae52-59ef-446f-917a-549d34ffbf8e" containerID="3eeb1bee61050e1f32f457f88245564bdfbb228b2e770616db2d75ef4e55866a" exitCode=0 Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.458779 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vx4r8" event={"ID":"45e6ae52-59ef-446f-917a-549d34ffbf8e","Type":"ContainerDied","Data":"3eeb1bee61050e1f32f457f88245564bdfbb228b2e770616db2d75ef4e55866a"} Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.470795 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" event={"ID":"5b640fc3-2425-48b0-adfa-3300a6d52002","Type":"ContainerStarted","Data":"097c9cec1e11f8aa83b30bf8b029f844e3b8d7cc463f16a2a3f8fa54330c0b4a"} Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.473086 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-twtbj" event={"ID":"278c91cc-2624-42cd-a35e-287e22d22f7d","Type":"ContainerStarted","Data":"a7440d1435344c8bb57482e03b8adcb5ddb52bcac5f7b4d1ca42e60d0e6e91e3"} Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.483257 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"ea590165224c827e615cf9230078895eabcfa03489c1d34d92662f043fe58752"} Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.486729 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5tnrx" event={"ID":"6870caea-07d6-4465-86b1-645a2e29b240","Type":"ContainerStarted","Data":"e7b7bdfa9636e908ca85620d4aa821fe24093ede2d0e43b39c562069b0b2da62"} Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.595207 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j246z" podStartSLOduration=8.203274834 podStartE2EDuration="1m0.595013537s" podCreationTimestamp="2026-03-13 11:51:41 +0000 UTC" firstStartedPulling="2026-03-13 11:51:43.586525391 +0000 UTC m=+219.224792154" lastFinishedPulling="2026-03-13 11:52:35.978264094 +0000 UTC m=+271.616530857" observedRunningTime="2026-03-13 11:52:41.567901267 +0000 UTC m=+277.206168031" watchObservedRunningTime="2026-03-13 11:52:41.595013537 +0000 UTC m=+277.233280300" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.086129 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.086437 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.494012 4837 generic.go:334] "Generic (PLEG): container finished" 
podID="e6060cf2-077e-4112-af57-f100e297f320" containerID="18157b9c0c2686c96644926d7ef6b55f8879075870dcef9cd6f8b2f09be008ae" exitCode=0 Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.494073 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft6cr" event={"ID":"e6060cf2-077e-4112-af57-f100e297f320","Type":"ContainerDied","Data":"18157b9c0c2686c96644926d7ef6b55f8879075870dcef9cd6f8b2f09be008ae"} Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.496725 4837 generic.go:334] "Generic (PLEG): container finished" podID="278c91cc-2624-42cd-a35e-287e22d22f7d" containerID="a7440d1435344c8bb57482e03b8adcb5ddb52bcac5f7b4d1ca42e60d0e6e91e3" exitCode=0 Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.497367 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twtbj" event={"ID":"278c91cc-2624-42cd-a35e-287e22d22f7d","Type":"ContainerDied","Data":"a7440d1435344c8bb57482e03b8adcb5ddb52bcac5f7b4d1ca42e60d0e6e91e3"} Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.498922 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jspgm" event={"ID":"5236ae0e-b305-4f1c-9125-bbac1eeb07f3","Type":"ContainerStarted","Data":"685e1bab7476abed18e470abc3204572f12f0276a4f4090ebce1ddb59e18c03e"} Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.500909 4837 generic.go:334] "Generic (PLEG): container finished" podID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" containerID="f0462ba3a7de7e503c15d1b89ef700574f3f475b90e578e1c36542ba0d37ed51" exitCode=0 Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.500949 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng6kk" event={"ID":"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d","Type":"ContainerDied","Data":"f0462ba3a7de7e503c15d1b89ef700574f3f475b90e578e1c36542ba0d37ed51"} Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.506145 4837 
generic.go:334] "Generic (PLEG): container finished" podID="6870caea-07d6-4465-86b1-645a2e29b240" containerID="e7b7bdfa9636e908ca85620d4aa821fe24093ede2d0e43b39c562069b0b2da62" exitCode=0 Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.506226 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5tnrx" event={"ID":"6870caea-07d6-4465-86b1-645a2e29b240","Type":"ContainerDied","Data":"e7b7bdfa9636e908ca85620d4aa821fe24093ede2d0e43b39c562069b0b2da62"} Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.507710 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" event={"ID":"5b640fc3-2425-48b0-adfa-3300a6d52002","Type":"ContainerStarted","Data":"a06e6485ac9f8f31411ee3ba603ad966da0a6cdc4ea8a84b4df845f32f9ff0f0"} Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.508599 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.511666 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vx4r8" event={"ID":"45e6ae52-59ef-446f-917a-549d34ffbf8e","Type":"ContainerStarted","Data":"437830c336afe1b86abf5ecc6f987c951e9edf2376fca0f8dd482ae1f16fa9c8"} Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.514912 4837 generic.go:334] "Generic (PLEG): container finished" podID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" containerID="0d9d2068fa9a75fcedf62135c026dbc7b6be8fecbfe8ba1e1bd893e7874fe650" exitCode=0 Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.515160 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7crb6" event={"ID":"080747b0-3d43-4ff1-b21c-b8ea9fc2f961","Type":"ContainerDied","Data":"0d9d2068fa9a75fcedf62135c026dbc7b6be8fecbfe8ba1e1bd893e7874fe650"} Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 
11:52:42.516038 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.544107 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb"] Mar 13 11:52:42 crc kubenswrapper[4837]: E0313 11:52:42.544989 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c0f7bce-90d1-4fe8-a832-8c4f55efd886" containerName="route-controller-manager" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.545012 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c0f7bce-90d1-4fe8-a832-8c4f55efd886" containerName="route-controller-manager" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.549187 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c0f7bce-90d1-4fe8-a832-8c4f55efd886" containerName="route-controller-manager" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.549995 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.553305 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" podStartSLOduration=4.553278738 podStartE2EDuration="4.553278738s" podCreationTimestamp="2026-03-13 11:52:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:52:42.543523192 +0000 UTC m=+278.181789955" watchObservedRunningTime="2026-03-13 11:52:42.553278738 +0000 UTC m=+278.191545501" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.557957 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.558606 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.558910 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.559142 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.559463 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.559783 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.588470 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb"] Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.592138 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vx4r8" podStartSLOduration=2.95879682 podStartE2EDuration="1m4.592118957s" podCreationTimestamp="2026-03-13 11:51:38 +0000 UTC" firstStartedPulling="2026-03-13 11:51:40.268777084 +0000 UTC m=+215.907043847" lastFinishedPulling="2026-03-13 11:52:41.902099221 +0000 UTC m=+277.540365984" observedRunningTime="2026-03-13 11:52:42.573662119 +0000 UTC m=+278.211928882" watchObservedRunningTime="2026-03-13 11:52:42.592118957 +0000 UTC m=+278.230385730" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.662462 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6fdbbb9-292f-4621-892a-53a6c1c13f65-config\") pod \"route-controller-manager-6764db5ddf-mqtsb\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.662524 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6fdbbb9-292f-4621-892a-53a6c1c13f65-serving-cert\") pod \"route-controller-manager-6764db5ddf-mqtsb\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.662553 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6fdbbb9-292f-4621-892a-53a6c1c13f65-client-ca\") pod \"route-controller-manager-6764db5ddf-mqtsb\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " 
pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.662606 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rft5b\" (UniqueName: \"kubernetes.io/projected/e6fdbbb9-292f-4621-892a-53a6c1c13f65-kube-api-access-rft5b\") pod \"route-controller-manager-6764db5ddf-mqtsb\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.669233 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jspgm" podStartSLOduration=3.315294886 podStartE2EDuration="1m2.669217516s" podCreationTimestamp="2026-03-13 11:51:40 +0000 UTC" firstStartedPulling="2026-03-13 11:51:42.505912501 +0000 UTC m=+218.144179264" lastFinishedPulling="2026-03-13 11:52:41.859835131 +0000 UTC m=+277.498101894" observedRunningTime="2026-03-13 11:52:42.649862969 +0000 UTC m=+278.288129732" watchObservedRunningTime="2026-03-13 11:52:42.669217516 +0000 UTC m=+278.307484279" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.764229 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6fdbbb9-292f-4621-892a-53a6c1c13f65-config\") pod \"route-controller-manager-6764db5ddf-mqtsb\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.764556 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6fdbbb9-292f-4621-892a-53a6c1c13f65-serving-cert\") pod \"route-controller-manager-6764db5ddf-mqtsb\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " 
pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.764665 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6fdbbb9-292f-4621-892a-53a6c1c13f65-client-ca\") pod \"route-controller-manager-6764db5ddf-mqtsb\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.764772 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rft5b\" (UniqueName: \"kubernetes.io/projected/e6fdbbb9-292f-4621-892a-53a6c1c13f65-kube-api-access-rft5b\") pod \"route-controller-manager-6764db5ddf-mqtsb\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.766219 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6fdbbb9-292f-4621-892a-53a6c1c13f65-config\") pod \"route-controller-manager-6764db5ddf-mqtsb\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.767750 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6fdbbb9-292f-4621-892a-53a6c1c13f65-client-ca\") pod \"route-controller-manager-6764db5ddf-mqtsb\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.780079 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e6fdbbb9-292f-4621-892a-53a6c1c13f65-serving-cert\") pod \"route-controller-manager-6764db5ddf-mqtsb\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.795390 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rft5b\" (UniqueName: \"kubernetes.io/projected/e6fdbbb9-292f-4621-892a-53a6c1c13f65-kube-api-access-rft5b\") pod \"route-controller-manager-6764db5ddf-mqtsb\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.887181 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:43 crc kubenswrapper[4837]: I0313 11:52:43.228899 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j246z" podUID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" containerName="registry-server" probeResult="failure" output=< Mar 13 11:52:43 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Mar 13 11:52:43 crc kubenswrapper[4837]: > Mar 13 11:52:43 crc kubenswrapper[4837]: I0313 11:52:43.318094 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb"] Mar 13 11:52:43 crc kubenswrapper[4837]: W0313 11:52:43.324121 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6fdbbb9_292f_4621_892a_53a6c1c13f65.slice/crio-ec6669f507f7c1917535f69f1388c95e2e8ec84b237ee008635f694dc17102e4 WatchSource:0}: Error finding container ec6669f507f7c1917535f69f1388c95e2e8ec84b237ee008635f694dc17102e4: Status 404 returned error can't find 
the container with id ec6669f507f7c1917535f69f1388c95e2e8ec84b237ee008635f694dc17102e4 Mar 13 11:52:43 crc kubenswrapper[4837]: I0313 11:52:43.522102 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5tnrx" event={"ID":"6870caea-07d6-4465-86b1-645a2e29b240","Type":"ContainerStarted","Data":"da09fb123851d2f67106724c294a3c4bfa2982538246898019311eb5b26eaca3"} Mar 13 11:52:43 crc kubenswrapper[4837]: I0313 11:52:43.525157 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7crb6" event={"ID":"080747b0-3d43-4ff1-b21c-b8ea9fc2f961","Type":"ContainerStarted","Data":"caf645720e683fd04b4144b714a66fde6f0b64f2a123d5270dabac05a2a4caaa"} Mar 13 11:52:43 crc kubenswrapper[4837]: I0313 11:52:43.526342 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" event={"ID":"e6fdbbb9-292f-4621-892a-53a6c1c13f65","Type":"ContainerStarted","Data":"ec6669f507f7c1917535f69f1388c95e2e8ec84b237ee008635f694dc17102e4"} Mar 13 11:52:43 crc kubenswrapper[4837]: I0313 11:52:43.538685 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5tnrx" podStartSLOduration=2.928566741 podStartE2EDuration="1m5.538664688s" podCreationTimestamp="2026-03-13 11:51:38 +0000 UTC" firstStartedPulling="2026-03-13 11:51:40.303557463 +0000 UTC m=+215.941824226" lastFinishedPulling="2026-03-13 11:52:42.91365541 +0000 UTC m=+278.551922173" observedRunningTime="2026-03-13 11:52:43.537395208 +0000 UTC m=+279.175661991" watchObservedRunningTime="2026-03-13 11:52:43.538664688 +0000 UTC m=+279.176931451" Mar 13 11:52:44 crc kubenswrapper[4837]: I0313 11:52:44.534036 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft6cr" 
event={"ID":"e6060cf2-077e-4112-af57-f100e297f320","Type":"ContainerStarted","Data":"981d238a29da8dc69fd7413479e02e57c3595b2787cab7169c57d333172bede1"} Mar 13 11:52:44 crc kubenswrapper[4837]: I0313 11:52:44.536859 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twtbj" event={"ID":"278c91cc-2624-42cd-a35e-287e22d22f7d","Type":"ContainerStarted","Data":"e6908d46230c52fea1c314d660f53fb74dbfd03beed19d0d7b5d526d78fc8a6c"} Mar 13 11:52:44 crc kubenswrapper[4837]: I0313 11:52:44.539490 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng6kk" event={"ID":"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d","Type":"ContainerStarted","Data":"1b6b0960e651037356989556f5ddff9457e82572c75941cbde7fc59810854ea0"} Mar 13 11:52:44 crc kubenswrapper[4837]: I0313 11:52:44.541059 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" event={"ID":"e6fdbbb9-292f-4621-892a-53a6c1c13f65","Type":"ContainerStarted","Data":"7fdcd49d58a893ad2332b8993c2a1cdba6ab455eb48a11b6f32d6c2c777e93c3"} Mar 13 11:52:44 crc kubenswrapper[4837]: I0313 11:52:44.562344 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7crb6" podStartSLOduration=2.913330798 podStartE2EDuration="1m4.56232436s" podCreationTimestamp="2026-03-13 11:51:40 +0000 UTC" firstStartedPulling="2026-03-13 11:51:41.349855809 +0000 UTC m=+216.988122582" lastFinishedPulling="2026-03-13 11:52:42.998849381 +0000 UTC m=+278.637116144" observedRunningTime="2026-03-13 11:52:43.558889654 +0000 UTC m=+279.197156427" watchObservedRunningTime="2026-03-13 11:52:44.56232436 +0000 UTC m=+280.200591123" Mar 13 11:52:44 crc kubenswrapper[4837]: I0313 11:52:44.580421 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" 
podStartSLOduration=6.580400795 podStartE2EDuration="6.580400795s" podCreationTimestamp="2026-03-13 11:52:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:52:44.580360554 +0000 UTC m=+280.218627317" watchObservedRunningTime="2026-03-13 11:52:44.580400795 +0000 UTC m=+280.218667558" Mar 13 11:52:44 crc kubenswrapper[4837]: I0313 11:52:44.583715 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ft6cr" podStartSLOduration=3.814979068 podStartE2EDuration="1m6.583697552s" podCreationTimestamp="2026-03-13 11:51:38 +0000 UTC" firstStartedPulling="2026-03-13 11:51:40.27279963 +0000 UTC m=+215.911066393" lastFinishedPulling="2026-03-13 11:52:43.041518114 +0000 UTC m=+278.679784877" observedRunningTime="2026-03-13 11:52:44.562216976 +0000 UTC m=+280.200483749" watchObservedRunningTime="2026-03-13 11:52:44.583697552 +0000 UTC m=+280.221964315" Mar 13 11:52:44 crc kubenswrapper[4837]: I0313 11:52:44.602303 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-twtbj" podStartSLOduration=4.751976723 podStartE2EDuration="1m7.602283484s" podCreationTimestamp="2026-03-13 11:51:37 +0000 UTC" firstStartedPulling="2026-03-13 11:51:40.293287261 +0000 UTC m=+215.931554034" lastFinishedPulling="2026-03-13 11:52:43.143594032 +0000 UTC m=+278.781860795" observedRunningTime="2026-03-13 11:52:44.599114243 +0000 UTC m=+280.237381006" watchObservedRunningTime="2026-03-13 11:52:44.602283484 +0000 UTC m=+280.240550247" Mar 13 11:52:45 crc kubenswrapper[4837]: I0313 11:52:45.552497 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:45 crc kubenswrapper[4837]: I0313 11:52:45.560570 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:45 crc kubenswrapper[4837]: I0313 11:52:45.580854 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ng6kk" podStartSLOduration=5.03349287 podStartE2EDuration="1m4.580826804s" podCreationTimestamp="2026-03-13 11:51:41 +0000 UTC" firstStartedPulling="2026-03-13 11:51:43.602827571 +0000 UTC m=+219.241094334" lastFinishedPulling="2026-03-13 11:52:43.150161505 +0000 UTC m=+278.788428268" observedRunningTime="2026-03-13 11:52:44.633510007 +0000 UTC m=+280.271776770" watchObservedRunningTime="2026-03-13 11:52:45.580826804 +0000 UTC m=+281.219093567" Mar 13 11:52:48 crc kubenswrapper[4837]: I0313 11:52:48.320877 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-twtbj" Mar 13 11:52:48 crc kubenswrapper[4837]: I0313 11:52:48.322573 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-twtbj" Mar 13 11:52:48 crc kubenswrapper[4837]: I0313 11:52:48.374670 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-twtbj" Mar 13 11:52:48 crc kubenswrapper[4837]: I0313 11:52:48.545290 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ft6cr" Mar 13 11:52:48 crc kubenswrapper[4837]: I0313 11:52:48.546850 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ft6cr" Mar 13 11:52:48 crc kubenswrapper[4837]: I0313 11:52:48.588380 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ft6cr" Mar 13 11:52:48 crc kubenswrapper[4837]: I0313 11:52:48.610078 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/community-operators-twtbj" Mar 13 11:52:48 crc kubenswrapper[4837]: I0313 11:52:48.717696 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:52:48 crc kubenswrapper[4837]: I0313 11:52:48.717756 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:52:48 crc kubenswrapper[4837]: I0313 11:52:48.754047 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:52:48 crc kubenswrapper[4837]: I0313 11:52:48.978344 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:52:48 crc kubenswrapper[4837]: I0313 11:52:48.978400 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:52:49 crc kubenswrapper[4837]: I0313 11:52:49.034551 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:52:49 crc kubenswrapper[4837]: I0313 11:52:49.616936 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:52:49 crc kubenswrapper[4837]: I0313 11:52:49.619048 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:52:49 crc kubenswrapper[4837]: I0313 11:52:49.632603 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ft6cr" Mar 13 11:52:50 crc kubenswrapper[4837]: I0313 11:52:50.506198 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:52:50 crc 
kubenswrapper[4837]: I0313 11:52:50.506483 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:52:50 crc kubenswrapper[4837]: I0313 11:52:50.575600 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:52:50 crc kubenswrapper[4837]: I0313 11:52:50.713447 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:52:50 crc kubenswrapper[4837]: I0313 11:52:50.880720 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vx4r8"] Mar 13 11:52:50 crc kubenswrapper[4837]: I0313 11:52:50.894504 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:52:50 crc kubenswrapper[4837]: I0313 11:52:50.894577 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:52:50 crc kubenswrapper[4837]: I0313 11:52:50.952592 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:52:51 crc kubenswrapper[4837]: I0313 11:52:51.588984 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:52:51 crc kubenswrapper[4837]: I0313 11:52:51.589141 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vx4r8" podUID="45e6ae52-59ef-446f-917a-549d34ffbf8e" containerName="registry-server" containerID="cri-o://437830c336afe1b86abf5ecc6f987c951e9edf2376fca0f8dd482ae1f16fa9c8" gracePeriod=2 Mar 13 11:52:51 crc kubenswrapper[4837]: I0313 11:52:51.589188 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:52:51 crc kubenswrapper[4837]: I0313 11:52:51.631724 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:52:51 crc kubenswrapper[4837]: I0313 11:52:51.662331 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:52:51 crc kubenswrapper[4837]: I0313 11:52:51.882131 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5tnrx"] Mar 13 11:52:51 crc kubenswrapper[4837]: I0313 11:52:51.882357 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5tnrx" podUID="6870caea-07d6-4465-86b1-645a2e29b240" containerName="registry-server" containerID="cri-o://da09fb123851d2f67106724c294a3c4bfa2982538246898019311eb5b26eaca3" gracePeriod=2 Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.060628 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.088549 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmfv2\" (UniqueName: \"kubernetes.io/projected/45e6ae52-59ef-446f-917a-549d34ffbf8e-kube-api-access-xmfv2\") pod \"45e6ae52-59ef-446f-917a-549d34ffbf8e\" (UID: \"45e6ae52-59ef-446f-917a-549d34ffbf8e\") " Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.088615 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45e6ae52-59ef-446f-917a-549d34ffbf8e-catalog-content\") pod \"45e6ae52-59ef-446f-917a-549d34ffbf8e\" (UID: \"45e6ae52-59ef-446f-917a-549d34ffbf8e\") " Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.088702 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45e6ae52-59ef-446f-917a-549d34ffbf8e-utilities\") pod \"45e6ae52-59ef-446f-917a-549d34ffbf8e\" (UID: \"45e6ae52-59ef-446f-917a-549d34ffbf8e\") " Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.091152 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45e6ae52-59ef-446f-917a-549d34ffbf8e-utilities" (OuterVolumeSpecName: "utilities") pod "45e6ae52-59ef-446f-917a-549d34ffbf8e" (UID: "45e6ae52-59ef-446f-917a-549d34ffbf8e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.095814 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45e6ae52-59ef-446f-917a-549d34ffbf8e-kube-api-access-xmfv2" (OuterVolumeSpecName: "kube-api-access-xmfv2") pod "45e6ae52-59ef-446f-917a-549d34ffbf8e" (UID: "45e6ae52-59ef-446f-917a-549d34ffbf8e"). InnerVolumeSpecName "kube-api-access-xmfv2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.132199 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.152342 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45e6ae52-59ef-446f-917a-549d34ffbf8e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45e6ae52-59ef-446f-917a-549d34ffbf8e" (UID: "45e6ae52-59ef-446f-917a-549d34ffbf8e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.167308 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.190465 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmfv2\" (UniqueName: \"kubernetes.io/projected/45e6ae52-59ef-446f-917a-549d34ffbf8e-kube-api-access-xmfv2\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.190508 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45e6ae52-59ef-446f-917a-549d34ffbf8e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.190598 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45e6ae52-59ef-446f-917a-549d34ffbf8e-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.487871 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.596919 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rpl9\" (UniqueName: \"kubernetes.io/projected/6870caea-07d6-4465-86b1-645a2e29b240-kube-api-access-4rpl9\") pod \"6870caea-07d6-4465-86b1-645a2e29b240\" (UID: \"6870caea-07d6-4465-86b1-645a2e29b240\") " Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.596993 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6870caea-07d6-4465-86b1-645a2e29b240-catalog-content\") pod \"6870caea-07d6-4465-86b1-645a2e29b240\" (UID: \"6870caea-07d6-4465-86b1-645a2e29b240\") " Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.597078 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6870caea-07d6-4465-86b1-645a2e29b240-utilities\") pod \"6870caea-07d6-4465-86b1-645a2e29b240\" (UID: \"6870caea-07d6-4465-86b1-645a2e29b240\") " Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.598165 4837 generic.go:334] "Generic (PLEG): container finished" podID="6870caea-07d6-4465-86b1-645a2e29b240" containerID="da09fb123851d2f67106724c294a3c4bfa2982538246898019311eb5b26eaca3" exitCode=0 Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.598246 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5tnrx" event={"ID":"6870caea-07d6-4465-86b1-645a2e29b240","Type":"ContainerDied","Data":"da09fb123851d2f67106724c294a3c4bfa2982538246898019311eb5b26eaca3"} Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.598282 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5tnrx" 
event={"ID":"6870caea-07d6-4465-86b1-645a2e29b240","Type":"ContainerDied","Data":"bfe6aca1334934677df8bf272b7d6fdeb1c785b92dcc8ef7c0566c6636ddfaa3"} Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.598309 4837 scope.go:117] "RemoveContainer" containerID="da09fb123851d2f67106724c294a3c4bfa2982538246898019311eb5b26eaca3" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.598338 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6870caea-07d6-4465-86b1-645a2e29b240-utilities" (OuterVolumeSpecName: "utilities") pod "6870caea-07d6-4465-86b1-645a2e29b240" (UID: "6870caea-07d6-4465-86b1-645a2e29b240"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.598480 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.601385 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6870caea-07d6-4465-86b1-645a2e29b240-kube-api-access-4rpl9" (OuterVolumeSpecName: "kube-api-access-4rpl9") pod "6870caea-07d6-4465-86b1-645a2e29b240" (UID: "6870caea-07d6-4465-86b1-645a2e29b240"). InnerVolumeSpecName "kube-api-access-4rpl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.603358 4837 generic.go:334] "Generic (PLEG): container finished" podID="45e6ae52-59ef-446f-917a-549d34ffbf8e" containerID="437830c336afe1b86abf5ecc6f987c951e9edf2376fca0f8dd482ae1f16fa9c8" exitCode=0 Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.603421 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.603451 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vx4r8" event={"ID":"45e6ae52-59ef-446f-917a-549d34ffbf8e","Type":"ContainerDied","Data":"437830c336afe1b86abf5ecc6f987c951e9edf2376fca0f8dd482ae1f16fa9c8"} Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.603526 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vx4r8" event={"ID":"45e6ae52-59ef-446f-917a-549d34ffbf8e","Type":"ContainerDied","Data":"b8c38b609b1ee957c7e1e1a563341d86aa7368639c49a74a0e6c541c1d320168"} Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.625746 4837 scope.go:117] "RemoveContainer" containerID="e7b7bdfa9636e908ca85620d4aa821fe24093ede2d0e43b39c562069b0b2da62" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.645280 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vx4r8"] Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.648767 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vx4r8"] Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.652904 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.661477 4837 scope.go:117] "RemoveContainer" containerID="ce94a39ad6afbdcb9bafffd4ef0157cee1cf3107185ab6fcc83d0813bc89a94d" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.680090 4837 scope.go:117] "RemoveContainer" containerID="da09fb123851d2f67106724c294a3c4bfa2982538246898019311eb5b26eaca3" Mar 13 11:52:52 crc kubenswrapper[4837]: E0313 11:52:52.680654 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"da09fb123851d2f67106724c294a3c4bfa2982538246898019311eb5b26eaca3\": container with ID starting with da09fb123851d2f67106724c294a3c4bfa2982538246898019311eb5b26eaca3 not found: ID does not exist" containerID="da09fb123851d2f67106724c294a3c4bfa2982538246898019311eb5b26eaca3" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.680697 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da09fb123851d2f67106724c294a3c4bfa2982538246898019311eb5b26eaca3"} err="failed to get container status \"da09fb123851d2f67106724c294a3c4bfa2982538246898019311eb5b26eaca3\": rpc error: code = NotFound desc = could not find container \"da09fb123851d2f67106724c294a3c4bfa2982538246898019311eb5b26eaca3\": container with ID starting with da09fb123851d2f67106724c294a3c4bfa2982538246898019311eb5b26eaca3 not found: ID does not exist" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.680727 4837 scope.go:117] "RemoveContainer" containerID="e7b7bdfa9636e908ca85620d4aa821fe24093ede2d0e43b39c562069b0b2da62" Mar 13 11:52:52 crc kubenswrapper[4837]: E0313 11:52:52.681068 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7b7bdfa9636e908ca85620d4aa821fe24093ede2d0e43b39c562069b0b2da62\": container with ID starting with e7b7bdfa9636e908ca85620d4aa821fe24093ede2d0e43b39c562069b0b2da62 not found: ID does not exist" containerID="e7b7bdfa9636e908ca85620d4aa821fe24093ede2d0e43b39c562069b0b2da62" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.681126 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7b7bdfa9636e908ca85620d4aa821fe24093ede2d0e43b39c562069b0b2da62"} err="failed to get container status \"e7b7bdfa9636e908ca85620d4aa821fe24093ede2d0e43b39c562069b0b2da62\": rpc error: code = NotFound desc = could not find container \"e7b7bdfa9636e908ca85620d4aa821fe24093ede2d0e43b39c562069b0b2da62\": container with ID 
starting with e7b7bdfa9636e908ca85620d4aa821fe24093ede2d0e43b39c562069b0b2da62 not found: ID does not exist" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.681160 4837 scope.go:117] "RemoveContainer" containerID="ce94a39ad6afbdcb9bafffd4ef0157cee1cf3107185ab6fcc83d0813bc89a94d" Mar 13 11:52:52 crc kubenswrapper[4837]: E0313 11:52:52.681581 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce94a39ad6afbdcb9bafffd4ef0157cee1cf3107185ab6fcc83d0813bc89a94d\": container with ID starting with ce94a39ad6afbdcb9bafffd4ef0157cee1cf3107185ab6fcc83d0813bc89a94d not found: ID does not exist" containerID="ce94a39ad6afbdcb9bafffd4ef0157cee1cf3107185ab6fcc83d0813bc89a94d" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.681656 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce94a39ad6afbdcb9bafffd4ef0157cee1cf3107185ab6fcc83d0813bc89a94d"} err="failed to get container status \"ce94a39ad6afbdcb9bafffd4ef0157cee1cf3107185ab6fcc83d0813bc89a94d\": rpc error: code = NotFound desc = could not find container \"ce94a39ad6afbdcb9bafffd4ef0157cee1cf3107185ab6fcc83d0813bc89a94d\": container with ID starting with ce94a39ad6afbdcb9bafffd4ef0157cee1cf3107185ab6fcc83d0813bc89a94d not found: ID does not exist" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.681677 4837 scope.go:117] "RemoveContainer" containerID="437830c336afe1b86abf5ecc6f987c951e9edf2376fca0f8dd482ae1f16fa9c8" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.699366 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6870caea-07d6-4465-86b1-645a2e29b240-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6870caea-07d6-4465-86b1-645a2e29b240" (UID: "6870caea-07d6-4465-86b1-645a2e29b240"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.702391 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rpl9\" (UniqueName: \"kubernetes.io/projected/6870caea-07d6-4465-86b1-645a2e29b240-kube-api-access-4rpl9\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.702778 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6870caea-07d6-4465-86b1-645a2e29b240-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.702989 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6870caea-07d6-4465-86b1-645a2e29b240-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.703887 4837 scope.go:117] "RemoveContainer" containerID="3eeb1bee61050e1f32f457f88245564bdfbb228b2e770616db2d75ef4e55866a" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.725602 4837 scope.go:117] "RemoveContainer" containerID="29b7adb5a9c0b54134cefd4b865e773828049385da6ba275d2791faad9875780" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.739122 4837 scope.go:117] "RemoveContainer" containerID="437830c336afe1b86abf5ecc6f987c951e9edf2376fca0f8dd482ae1f16fa9c8" Mar 13 11:52:52 crc kubenswrapper[4837]: E0313 11:52:52.739796 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"437830c336afe1b86abf5ecc6f987c951e9edf2376fca0f8dd482ae1f16fa9c8\": container with ID starting with 437830c336afe1b86abf5ecc6f987c951e9edf2376fca0f8dd482ae1f16fa9c8 not found: ID does not exist" containerID="437830c336afe1b86abf5ecc6f987c951e9edf2376fca0f8dd482ae1f16fa9c8" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.739843 4837 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"437830c336afe1b86abf5ecc6f987c951e9edf2376fca0f8dd482ae1f16fa9c8"} err="failed to get container status \"437830c336afe1b86abf5ecc6f987c951e9edf2376fca0f8dd482ae1f16fa9c8\": rpc error: code = NotFound desc = could not find container \"437830c336afe1b86abf5ecc6f987c951e9edf2376fca0f8dd482ae1f16fa9c8\": container with ID starting with 437830c336afe1b86abf5ecc6f987c951e9edf2376fca0f8dd482ae1f16fa9c8 not found: ID does not exist" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.739870 4837 scope.go:117] "RemoveContainer" containerID="3eeb1bee61050e1f32f457f88245564bdfbb228b2e770616db2d75ef4e55866a" Mar 13 11:52:52 crc kubenswrapper[4837]: E0313 11:52:52.740137 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eeb1bee61050e1f32f457f88245564bdfbb228b2e770616db2d75ef4e55866a\": container with ID starting with 3eeb1bee61050e1f32f457f88245564bdfbb228b2e770616db2d75ef4e55866a not found: ID does not exist" containerID="3eeb1bee61050e1f32f457f88245564bdfbb228b2e770616db2d75ef4e55866a" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.740167 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eeb1bee61050e1f32f457f88245564bdfbb228b2e770616db2d75ef4e55866a"} err="failed to get container status \"3eeb1bee61050e1f32f457f88245564bdfbb228b2e770616db2d75ef4e55866a\": rpc error: code = NotFound desc = could not find container \"3eeb1bee61050e1f32f457f88245564bdfbb228b2e770616db2d75ef4e55866a\": container with ID starting with 3eeb1bee61050e1f32f457f88245564bdfbb228b2e770616db2d75ef4e55866a not found: ID does not exist" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.740187 4837 scope.go:117] "RemoveContainer" containerID="29b7adb5a9c0b54134cefd4b865e773828049385da6ba275d2791faad9875780" Mar 13 11:52:52 crc kubenswrapper[4837]: E0313 11:52:52.740411 4837 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"29b7adb5a9c0b54134cefd4b865e773828049385da6ba275d2791faad9875780\": container with ID starting with 29b7adb5a9c0b54134cefd4b865e773828049385da6ba275d2791faad9875780 not found: ID does not exist" containerID="29b7adb5a9c0b54134cefd4b865e773828049385da6ba275d2791faad9875780" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.740451 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29b7adb5a9c0b54134cefd4b865e773828049385da6ba275d2791faad9875780"} err="failed to get container status \"29b7adb5a9c0b54134cefd4b865e773828049385da6ba275d2791faad9875780\": rpc error: code = NotFound desc = could not find container \"29b7adb5a9c0b54134cefd4b865e773828049385da6ba275d2791faad9875780\": container with ID starting with 29b7adb5a9c0b54134cefd4b865e773828049385da6ba275d2791faad9875780 not found: ID does not exist" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.928018 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5tnrx"] Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.931540 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5tnrx"] Mar 13 11:52:53 crc kubenswrapper[4837]: I0313 11:52:53.055536 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45e6ae52-59ef-446f-917a-549d34ffbf8e" path="/var/lib/kubelet/pods/45e6ae52-59ef-446f-917a-549d34ffbf8e/volumes" Mar 13 11:52:53 crc kubenswrapper[4837]: I0313 11:52:53.056208 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6870caea-07d6-4465-86b1-645a2e29b240" path="/var/lib/kubelet/pods/6870caea-07d6-4465-86b1-645a2e29b240/volumes" Mar 13 11:52:53 crc kubenswrapper[4837]: I0313 11:52:53.282744 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jspgm"] Mar 13 11:52:53 crc kubenswrapper[4837]: 
I0313 11:52:53.609254 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jspgm" podUID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" containerName="registry-server" containerID="cri-o://685e1bab7476abed18e470abc3204572f12f0276a4f4090ebce1ddb59e18c03e" gracePeriod=2 Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.088272 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.121828 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-utilities\") pod \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\" (UID: \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\") " Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.121958 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-catalog-content\") pod \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\" (UID: \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\") " Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.122071 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l42fc\" (UniqueName: \"kubernetes.io/projected/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-kube-api-access-l42fc\") pod \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\" (UID: \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\") " Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.122973 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-utilities" (OuterVolumeSpecName: "utilities") pod "5236ae0e-b305-4f1c-9125-bbac1eeb07f3" (UID: "5236ae0e-b305-4f1c-9125-bbac1eeb07f3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.123905 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.126595 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-kube-api-access-l42fc" (OuterVolumeSpecName: "kube-api-access-l42fc") pod "5236ae0e-b305-4f1c-9125-bbac1eeb07f3" (UID: "5236ae0e-b305-4f1c-9125-bbac1eeb07f3"). InnerVolumeSpecName "kube-api-access-l42fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.148310 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5236ae0e-b305-4f1c-9125-bbac1eeb07f3" (UID: "5236ae0e-b305-4f1c-9125-bbac1eeb07f3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.225224 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.225268 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l42fc\" (UniqueName: \"kubernetes.io/projected/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-kube-api-access-l42fc\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.618619 4837 generic.go:334] "Generic (PLEG): container finished" podID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" containerID="685e1bab7476abed18e470abc3204572f12f0276a4f4090ebce1ddb59e18c03e" exitCode=0 Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.618705 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.618782 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jspgm" event={"ID":"5236ae0e-b305-4f1c-9125-bbac1eeb07f3","Type":"ContainerDied","Data":"685e1bab7476abed18e470abc3204572f12f0276a4f4090ebce1ddb59e18c03e"} Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.618933 4837 scope.go:117] "RemoveContainer" containerID="685e1bab7476abed18e470abc3204572f12f0276a4f4090ebce1ddb59e18c03e" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.619213 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jspgm" event={"ID":"5236ae0e-b305-4f1c-9125-bbac1eeb07f3","Type":"ContainerDied","Data":"307f294c9c816d0f8c581cbf3561f2a5e0cff01395517438e2ad320ce61f35e4"} Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.646100 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-jspgm"] Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.649701 4837 scope.go:117] "RemoveContainer" containerID="2b9a329e446509859ed3d2441ed61db997519085747ff5a45b3c942e5be127c3" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.650227 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jspgm"] Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.675299 4837 scope.go:117] "RemoveContainer" containerID="23cf1d25c2a824231d5fd39eb5a9920394e587e257254f6ffa6bf893fe9f2624" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.689455 4837 scope.go:117] "RemoveContainer" containerID="685e1bab7476abed18e470abc3204572f12f0276a4f4090ebce1ddb59e18c03e" Mar 13 11:52:54 crc kubenswrapper[4837]: E0313 11:52:54.689858 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"685e1bab7476abed18e470abc3204572f12f0276a4f4090ebce1ddb59e18c03e\": container with ID starting with 685e1bab7476abed18e470abc3204572f12f0276a4f4090ebce1ddb59e18c03e not found: ID does not exist" containerID="685e1bab7476abed18e470abc3204572f12f0276a4f4090ebce1ddb59e18c03e" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.689899 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"685e1bab7476abed18e470abc3204572f12f0276a4f4090ebce1ddb59e18c03e"} err="failed to get container status \"685e1bab7476abed18e470abc3204572f12f0276a4f4090ebce1ddb59e18c03e\": rpc error: code = NotFound desc = could not find container \"685e1bab7476abed18e470abc3204572f12f0276a4f4090ebce1ddb59e18c03e\": container with ID starting with 685e1bab7476abed18e470abc3204572f12f0276a4f4090ebce1ddb59e18c03e not found: ID does not exist" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.689925 4837 scope.go:117] "RemoveContainer" 
containerID="2b9a329e446509859ed3d2441ed61db997519085747ff5a45b3c942e5be127c3" Mar 13 11:52:54 crc kubenswrapper[4837]: E0313 11:52:54.690311 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b9a329e446509859ed3d2441ed61db997519085747ff5a45b3c942e5be127c3\": container with ID starting with 2b9a329e446509859ed3d2441ed61db997519085747ff5a45b3c942e5be127c3 not found: ID does not exist" containerID="2b9a329e446509859ed3d2441ed61db997519085747ff5a45b3c942e5be127c3" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.690358 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b9a329e446509859ed3d2441ed61db997519085747ff5a45b3c942e5be127c3"} err="failed to get container status \"2b9a329e446509859ed3d2441ed61db997519085747ff5a45b3c942e5be127c3\": rpc error: code = NotFound desc = could not find container \"2b9a329e446509859ed3d2441ed61db997519085747ff5a45b3c942e5be127c3\": container with ID starting with 2b9a329e446509859ed3d2441ed61db997519085747ff5a45b3c942e5be127c3 not found: ID does not exist" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.690375 4837 scope.go:117] "RemoveContainer" containerID="23cf1d25c2a824231d5fd39eb5a9920394e587e257254f6ffa6bf893fe9f2624" Mar 13 11:52:54 crc kubenswrapper[4837]: E0313 11:52:54.690695 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23cf1d25c2a824231d5fd39eb5a9920394e587e257254f6ffa6bf893fe9f2624\": container with ID starting with 23cf1d25c2a824231d5fd39eb5a9920394e587e257254f6ffa6bf893fe9f2624 not found: ID does not exist" containerID="23cf1d25c2a824231d5fd39eb5a9920394e587e257254f6ffa6bf893fe9f2624" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.690749 4837 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"23cf1d25c2a824231d5fd39eb5a9920394e587e257254f6ffa6bf893fe9f2624"} err="failed to get container status \"23cf1d25c2a824231d5fd39eb5a9920394e587e257254f6ffa6bf893fe9f2624\": rpc error: code = NotFound desc = could not find container \"23cf1d25c2a824231d5fd39eb5a9920394e587e257254f6ffa6bf893fe9f2624\": container with ID starting with 23cf1d25c2a824231d5fd39eb5a9920394e587e257254f6ffa6bf893fe9f2624 not found: ID does not exist" Mar 13 11:52:55 crc kubenswrapper[4837]: I0313 11:52:55.055361 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" path="/var/lib/kubelet/pods/5236ae0e-b305-4f1c-9125-bbac1eeb07f3/volumes" Mar 13 11:52:55 crc kubenswrapper[4837]: I0313 11:52:55.686578 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j246z"] Mar 13 11:52:55 crc kubenswrapper[4837]: I0313 11:52:55.687014 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j246z" podUID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" containerName="registry-server" containerID="cri-o://14abcb9ea1fbbb60b399b29e871faa61fa90553e6b8ac0c1201e048e766e55b2" gracePeriod=2 Mar 13 11:52:56 crc kubenswrapper[4837]: I0313 11:52:56.634721 4837 generic.go:334] "Generic (PLEG): container finished" podID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" containerID="14abcb9ea1fbbb60b399b29e871faa61fa90553e6b8ac0c1201e048e766e55b2" exitCode=0 Mar 13 11:52:56 crc kubenswrapper[4837]: I0313 11:52:56.634764 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j246z" event={"ID":"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6","Type":"ContainerDied","Data":"14abcb9ea1fbbb60b399b29e871faa61fa90553e6b8ac0c1201e048e766e55b2"} Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.185321 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.261844 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-utilities\") pod \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\" (UID: \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\") " Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.261928 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-catalog-content\") pod \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\" (UID: \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\") " Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.261953 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx4zq\" (UniqueName: \"kubernetes.io/projected/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-kube-api-access-xx4zq\") pod \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\" (UID: \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\") " Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.262897 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-utilities" (OuterVolumeSpecName: "utilities") pod "32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" (UID: "32a36cbe-a17f-46bf-9c6a-1df6f427e2c6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.266943 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-kube-api-access-xx4zq" (OuterVolumeSpecName: "kube-api-access-xx4zq") pod "32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" (UID: "32a36cbe-a17f-46bf-9c6a-1df6f427e2c6"). InnerVolumeSpecName "kube-api-access-xx4zq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.362872 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.362899 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx4zq\" (UniqueName: \"kubernetes.io/projected/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-kube-api-access-xx4zq\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.404903 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" (UID: "32a36cbe-a17f-46bf-9c6a-1df6f427e2c6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.464820 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.642780 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j246z" event={"ID":"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6","Type":"ContainerDied","Data":"524d259feb76e4121fddc10b32a9829c69c7b137ab82d2d2c18f81ea9d556b60"} Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.642833 4837 scope.go:117] "RemoveContainer" containerID="14abcb9ea1fbbb60b399b29e871faa61fa90553e6b8ac0c1201e048e766e55b2" Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.642839 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.660151 4837 scope.go:117] "RemoveContainer" containerID="73bd112e9625df12ee2d18dc3c732843142d27d7ee69bd7e78c3b55fe032dc84" Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.679328 4837 scope.go:117] "RemoveContainer" containerID="613107c1ce24dcf9cb1cf0c1623f3de9a7d5b33bc09c57a646911cae7011d82e" Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.680234 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j246z"] Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.687743 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j246z"] Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.128118 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7bb68b4479-msngs"] Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.128346 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" podUID="5b640fc3-2425-48b0-adfa-3300a6d52002" containerName="controller-manager" containerID="cri-o://a06e6485ac9f8f31411ee3ba603ad966da0a6cdc4ea8a84b4df845f32f9ff0f0" gracePeriod=30 Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.230458 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb"] Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.231293 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" podUID="e6fdbbb9-292f-4621-892a-53a6c1c13f65" containerName="route-controller-manager" containerID="cri-o://7fdcd49d58a893ad2332b8993c2a1cdba6ab455eb48a11b6f32d6c2c777e93c3" gracePeriod=30 Mar 13 11:52:58 crc 
kubenswrapper[4837]: I0313 11:52:58.649889 4837 generic.go:334] "Generic (PLEG): container finished" podID="5b640fc3-2425-48b0-adfa-3300a6d52002" containerID="a06e6485ac9f8f31411ee3ba603ad966da0a6cdc4ea8a84b4df845f32f9ff0f0" exitCode=0 Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.649966 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" event={"ID":"5b640fc3-2425-48b0-adfa-3300a6d52002","Type":"ContainerDied","Data":"a06e6485ac9f8f31411ee3ba603ad966da0a6cdc4ea8a84b4df845f32f9ff0f0"} Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.658369 4837 generic.go:334] "Generic (PLEG): container finished" podID="e6fdbbb9-292f-4621-892a-53a6c1c13f65" containerID="7fdcd49d58a893ad2332b8993c2a1cdba6ab455eb48a11b6f32d6c2c777e93c3" exitCode=0 Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.658404 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" event={"ID":"e6fdbbb9-292f-4621-892a-53a6c1c13f65","Type":"ContainerDied","Data":"7fdcd49d58a893ad2332b8993c2a1cdba6ab455eb48a11b6f32d6c2c777e93c3"} Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.784996 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.791725 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.893944 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-proxy-ca-bundles\") pod \"5b640fc3-2425-48b0-adfa-3300a6d52002\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.893993 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rft5b\" (UniqueName: \"kubernetes.io/projected/e6fdbbb9-292f-4621-892a-53a6c1c13f65-kube-api-access-rft5b\") pod \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.894029 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6fdbbb9-292f-4621-892a-53a6c1c13f65-config\") pod \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.894060 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6fdbbb9-292f-4621-892a-53a6c1c13f65-serving-cert\") pod \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.894092 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b640fc3-2425-48b0-adfa-3300a6d52002-serving-cert\") pod \"5b640fc3-2425-48b0-adfa-3300a6d52002\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.894114 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-client-ca\") pod \"5b640fc3-2425-48b0-adfa-3300a6d52002\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.894132 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kndkg\" (UniqueName: \"kubernetes.io/projected/5b640fc3-2425-48b0-adfa-3300a6d52002-kube-api-access-kndkg\") pod \"5b640fc3-2425-48b0-adfa-3300a6d52002\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.894154 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6fdbbb9-292f-4621-892a-53a6c1c13f65-client-ca\") pod \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.894230 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-config\") pod \"5b640fc3-2425-48b0-adfa-3300a6d52002\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.895072 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6fdbbb9-292f-4621-892a-53a6c1c13f65-client-ca" (OuterVolumeSpecName: "client-ca") pod "e6fdbbb9-292f-4621-892a-53a6c1c13f65" (UID: "e6fdbbb9-292f-4621-892a-53a6c1c13f65"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.895150 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6fdbbb9-292f-4621-892a-53a6c1c13f65-config" (OuterVolumeSpecName: "config") pod "e6fdbbb9-292f-4621-892a-53a6c1c13f65" (UID: "e6fdbbb9-292f-4621-892a-53a6c1c13f65"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.895169 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5b640fc3-2425-48b0-adfa-3300a6d52002" (UID: "5b640fc3-2425-48b0-adfa-3300a6d52002"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.895188 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-client-ca" (OuterVolumeSpecName: "client-ca") pod "5b640fc3-2425-48b0-adfa-3300a6d52002" (UID: "5b640fc3-2425-48b0-adfa-3300a6d52002"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.895250 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-config" (OuterVolumeSpecName: "config") pod "5b640fc3-2425-48b0-adfa-3300a6d52002" (UID: "5b640fc3-2425-48b0-adfa-3300a6d52002"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.895459 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.895473 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.895482 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6fdbbb9-292f-4621-892a-53a6c1c13f65-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.895490 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.895498 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6fdbbb9-292f-4621-892a-53a6c1c13f65-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.907882 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b640fc3-2425-48b0-adfa-3300a6d52002-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5b640fc3-2425-48b0-adfa-3300a6d52002" (UID: "5b640fc3-2425-48b0-adfa-3300a6d52002"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.907915 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6fdbbb9-292f-4621-892a-53a6c1c13f65-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e6fdbbb9-292f-4621-892a-53a6c1c13f65" (UID: "e6fdbbb9-292f-4621-892a-53a6c1c13f65"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.907952 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b640fc3-2425-48b0-adfa-3300a6d52002-kube-api-access-kndkg" (OuterVolumeSpecName: "kube-api-access-kndkg") pod "5b640fc3-2425-48b0-adfa-3300a6d52002" (UID: "5b640fc3-2425-48b0-adfa-3300a6d52002"). InnerVolumeSpecName "kube-api-access-kndkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.908060 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6fdbbb9-292f-4621-892a-53a6c1c13f65-kube-api-access-rft5b" (OuterVolumeSpecName: "kube-api-access-rft5b") pod "e6fdbbb9-292f-4621-892a-53a6c1c13f65" (UID: "e6fdbbb9-292f-4621-892a-53a6c1c13f65"). InnerVolumeSpecName "kube-api-access-rft5b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.996671 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6fdbbb9-292f-4621-892a-53a6c1c13f65-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.996708 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b640fc3-2425-48b0-adfa-3300a6d52002-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.996719 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kndkg\" (UniqueName: \"kubernetes.io/projected/5b640fc3-2425-48b0-adfa-3300a6d52002-kube-api-access-kndkg\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.996730 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rft5b\" (UniqueName: \"kubernetes.io/projected/e6fdbbb9-292f-4621-892a-53a6c1c13f65-kube-api-access-rft5b\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.054062 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" path="/var/lib/kubelet/pods/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6/volumes" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.562189 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j"] Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.562560 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b640fc3-2425-48b0-adfa-3300a6d52002" containerName="controller-manager" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.562594 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b640fc3-2425-48b0-adfa-3300a6d52002" containerName="controller-manager" Mar 
13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.562619 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e6ae52-59ef-446f-917a-549d34ffbf8e" containerName="extract-utilities" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.562668 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e6ae52-59ef-446f-917a-549d34ffbf8e" containerName="extract-utilities" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.562684 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6870caea-07d6-4465-86b1-645a2e29b240" containerName="extract-content" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.562700 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6870caea-07d6-4465-86b1-645a2e29b240" containerName="extract-content" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.562723 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" containerName="registry-server" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.562735 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" containerName="registry-server" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.562752 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e6ae52-59ef-446f-917a-549d34ffbf8e" containerName="extract-content" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.562763 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e6ae52-59ef-446f-917a-549d34ffbf8e" containerName="extract-content" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.562785 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" containerName="extract-content" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.562798 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" containerName="extract-content" Mar 13 
11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.562810 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" containerName="extract-content" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.562847 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" containerName="extract-content" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.562873 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" containerName="extract-utilities" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.562885 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" containerName="extract-utilities" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.562905 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" containerName="extract-utilities" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.562917 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" containerName="extract-utilities" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.562932 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" containerName="registry-server" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.562944 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" containerName="registry-server" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.562961 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6870caea-07d6-4465-86b1-645a2e29b240" containerName="extract-utilities" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.562974 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6870caea-07d6-4465-86b1-645a2e29b240" containerName="extract-utilities" Mar 13 
11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.562992 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6870caea-07d6-4465-86b1-645a2e29b240" containerName="registry-server" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.563005 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6870caea-07d6-4465-86b1-645a2e29b240" containerName="registry-server" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.563024 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6fdbbb9-292f-4621-892a-53a6c1c13f65" containerName="route-controller-manager" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.563036 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6fdbbb9-292f-4621-892a-53a6c1c13f65" containerName="route-controller-manager" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.563058 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e6ae52-59ef-446f-917a-549d34ffbf8e" containerName="registry-server" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.563070 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e6ae52-59ef-446f-917a-549d34ffbf8e" containerName="registry-server" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.563228 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="45e6ae52-59ef-446f-917a-549d34ffbf8e" containerName="registry-server" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.563261 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6870caea-07d6-4465-86b1-645a2e29b240" containerName="registry-server" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.563276 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6fdbbb9-292f-4621-892a-53a6c1c13f65" containerName="route-controller-manager" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.563296 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" 
containerName="registry-server" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.563313 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" containerName="registry-server" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.563326 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b640fc3-2425-48b0-adfa-3300a6d52002" containerName="controller-manager" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.564201 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.573543 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-68c5756767-4nmg2"] Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.574187 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j"] Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.574276 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.578146 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-68c5756767-4nmg2"] Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.612307 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w9b8\" (UniqueName: \"kubernetes.io/projected/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50-kube-api-access-5w9b8\") pod \"route-controller-manager-54bb9484b9-l9k8j\" (UID: \"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\") " pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.612410 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50-client-ca\") pod \"route-controller-manager-54bb9484b9-l9k8j\" (UID: \"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\") " pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.612526 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50-config\") pod \"route-controller-manager-54bb9484b9-l9k8j\" (UID: \"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\") " pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.612672 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50-serving-cert\") pod \"route-controller-manager-54bb9484b9-l9k8j\" (UID: 
\"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\") " pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.666414 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" event={"ID":"5b640fc3-2425-48b0-adfa-3300a6d52002","Type":"ContainerDied","Data":"097c9cec1e11f8aa83b30bf8b029f844e3b8d7cc463f16a2a3f8fa54330c0b4a"} Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.666442 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.666481 4837 scope.go:117] "RemoveContainer" containerID="a06e6485ac9f8f31411ee3ba603ad966da0a6cdc4ea8a84b4df845f32f9ff0f0" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.668130 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" event={"ID":"e6fdbbb9-292f-4621-892a-53a6c1c13f65","Type":"ContainerDied","Data":"ec6669f507f7c1917535f69f1388c95e2e8ec84b237ee008635f694dc17102e4"} Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.668247 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.688040 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7bb68b4479-msngs"] Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.688978 4837 scope.go:117] "RemoveContainer" containerID="7fdcd49d58a893ad2332b8993c2a1cdba6ab455eb48a11b6f32d6c2c777e93c3" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.690811 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7bb68b4479-msngs"] Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.695487 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb"] Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.697987 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb"] Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.714074 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50-serving-cert\") pod \"route-controller-manager-54bb9484b9-l9k8j\" (UID: \"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\") " pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.714135 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/557e8146-afbb-41a0-a477-c69f4575656c-client-ca\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.714193 
4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8xbg\" (UniqueName: \"kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.714220 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w9b8\" (UniqueName: \"kubernetes.io/projected/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50-kube-api-access-5w9b8\") pod \"route-controller-manager-54bb9484b9-l9k8j\" (UID: \"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\") " pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.714240 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/557e8146-afbb-41a0-a477-c69f4575656c-config\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.714266 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/557e8146-afbb-41a0-a477-c69f4575656c-serving-cert\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.714286 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50-client-ca\") pod 
\"route-controller-manager-54bb9484b9-l9k8j\" (UID: \"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\") " pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.714308 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50-config\") pod \"route-controller-manager-54bb9484b9-l9k8j\" (UID: \"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\") " pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.714328 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/557e8146-afbb-41a0-a477-c69f4575656c-proxy-ca-bundles\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.715568 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50-client-ca\") pod \"route-controller-manager-54bb9484b9-l9k8j\" (UID: \"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\") " pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.715850 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50-config\") pod \"route-controller-manager-54bb9484b9-l9k8j\" (UID: \"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\") " pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.721977 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50-serving-cert\") pod \"route-controller-manager-54bb9484b9-l9k8j\" (UID: \"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\") " pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.730948 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w9b8\" (UniqueName: \"kubernetes.io/projected/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50-kube-api-access-5w9b8\") pod \"route-controller-manager-54bb9484b9-l9k8j\" (UID: \"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\") " pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.741441 4837 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.742302 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.742656 4837 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.742681 4837 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.742788 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.742800 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.742809 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.742815 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.742823 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.742829 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.742835 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.742841 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.742848 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.742856 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.742862 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.742868 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.742877 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.742883 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.742893 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.742899 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.742907 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: 
I0313 11:52:59.742913 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.742993 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.743001 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.743009 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.743017 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.743024 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.743030 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.743039 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.743128 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.743135 4837 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.743212 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab" gracePeriod=15 Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.743522 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.744211 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.744853 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea" gracePeriod=15 Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.744924 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2" gracePeriod=15 Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.745023 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" 
containerID="cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208" gracePeriod=15 Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.745117 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17" gracePeriod=15 Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.815326 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.815371 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.815388 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.815411 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/557e8146-afbb-41a0-a477-c69f4575656c-client-ca\") pod \"controller-manager-68c5756767-4nmg2\" (UID: 
\"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.815442 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.815455 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.815488 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8xbg\" (UniqueName: \"kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.815505 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.815549 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/557e8146-afbb-41a0-a477-c69f4575656c-config\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.815568 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.815588 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/557e8146-afbb-41a0-a477-c69f4575656c-serving-cert\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.815610 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/557e8146-afbb-41a0-a477-c69f4575656c-proxy-ca-bundles\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.815664 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.816285 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/557e8146-afbb-41a0-a477-c69f4575656c-client-ca\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.816441 4837 projected.go:194] Error preparing data for projected volume kube-api-access-r8xbg for pod openshift-controller-manager/controller-manager-68c5756767-4nmg2: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.816891 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/557e8146-afbb-41a0-a477-c69f4575656c-config\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.817152 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg podName:557e8146-afbb-41a0-a477-c69f4575656c nodeName:}" failed. No retries permitted until 2026-03-13 11:53:00.31712132 +0000 UTC m=+295.955388153 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-r8xbg" (UniqueName: "kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg") pod "controller-manager-68c5756767-4nmg2" (UID: "557e8146-afbb-41a0-a477-c69f4575656c") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.817514 4837 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.138:6443: connect: connection refused" event="&Event{ObjectMeta:{controller-manager-68c5756767-4nmg2.189c646eaeb065ea openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-68c5756767-4nmg2,UID:557e8146-afbb-41a0-a477-c69f4575656c,APIVersion:v1,ResourceVersion:29927,FieldPath:,},Reason:FailedMount,Message:MountVolume.SetUp failed for volume \"kube-api-access-r8xbg\" : failed to fetch token: Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token\": dial tcp 38.102.83.138:6443: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:52:59.81710897 +0000 UTC m=+295.455375743,LastTimestamp:2026-03-13 11:52:59.81710897 +0000 UTC m=+295.455375743,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.817726 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/557e8146-afbb-41a0-a477-c69f4575656c-proxy-ca-bundles\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.823195 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/557e8146-afbb-41a0-a477-c69f4575656c-serving-cert\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917117 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917193 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917214 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917230 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917260 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917273 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917281 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917313 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917319 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917321 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917360 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917362 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917368 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917337 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917385 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917399 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.928006 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.221990 4837 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.222322 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.323418 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8xbg\" (UniqueName: \"kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.324140 4837 projected.go:194] Error preparing data for projected volume kube-api-access-r8xbg for pod openshift-controller-manager/controller-manager-68c5756767-4nmg2: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.324253 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg podName:557e8146-afbb-41a0-a477-c69f4575656c nodeName:}" failed. 
No retries permitted until 2026-03-13 11:53:01.324219938 +0000 UTC m=+296.962486741 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-r8xbg" (UniqueName: "kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg") pod "controller-manager-68c5756767-4nmg2" (UID: "557e8146-afbb-41a0-a477-c69f4575656c") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.508904 4837 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 13 11:53:00 crc kubenswrapper[4837]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-54bb9484b9-l9k8j_openshift-route-controller-manager_c6c2dd46-4cc0-4802-b96e-7d395d3dbc50_0(0e521213d7075b7205e2d99e482f1c55c0f091f3d98c1703c32a1f05a4e7e878): error adding pod openshift-route-controller-manager_route-controller-manager-54bb9484b9-l9k8j to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0e521213d7075b7205e2d99e482f1c55c0f091f3d98c1703c32a1f05a4e7e878" Netns:"/var/run/netns/d70e04b4-380d-4dc7-a3e7-5f2471ba8c3a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-54bb9484b9-l9k8j;K8S_POD_INFRA_CONTAINER_ID=0e521213d7075b7205e2d99e482f1c55c0f091f3d98c1703c32a1f05a4e7e878;K8S_POD_UID=c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j] networking: Multus: [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50]: error setting the networks status: 
SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-54bb9484b9-l9k8j?timeout=1m0s": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:00 crc kubenswrapper[4837]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 11:53:00 crc kubenswrapper[4837]: > Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.508984 4837 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 13 11:53:00 crc kubenswrapper[4837]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-54bb9484b9-l9k8j_openshift-route-controller-manager_c6c2dd46-4cc0-4802-b96e-7d395d3dbc50_0(0e521213d7075b7205e2d99e482f1c55c0f091f3d98c1703c32a1f05a4e7e878): error adding pod openshift-route-controller-manager_route-controller-manager-54bb9484b9-l9k8j to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0e521213d7075b7205e2d99e482f1c55c0f091f3d98c1703c32a1f05a4e7e878" Netns:"/var/run/netns/d70e04b4-380d-4dc7-a3e7-5f2471ba8c3a" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-54bb9484b9-l9k8j;K8S_POD_INFRA_CONTAINER_ID=0e521213d7075b7205e2d99e482f1c55c0f091f3d98c1703c32a1f05a4e7e878;K8S_POD_UID=c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j] networking: Multus: [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-54bb9484b9-l9k8j?timeout=1m0s": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:00 crc kubenswrapper[4837]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 11:53:00 crc kubenswrapper[4837]: > pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.509008 4837 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 13 11:53:00 crc kubenswrapper[4837]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_route-controller-manager-54bb9484b9-l9k8j_openshift-route-controller-manager_c6c2dd46-4cc0-4802-b96e-7d395d3dbc50_0(0e521213d7075b7205e2d99e482f1c55c0f091f3d98c1703c32a1f05a4e7e878): error adding pod openshift-route-controller-manager_route-controller-manager-54bb9484b9-l9k8j to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0e521213d7075b7205e2d99e482f1c55c0f091f3d98c1703c32a1f05a4e7e878" Netns:"/var/run/netns/d70e04b4-380d-4dc7-a3e7-5f2471ba8c3a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-54bb9484b9-l9k8j;K8S_POD_INFRA_CONTAINER_ID=0e521213d7075b7205e2d99e482f1c55c0f091f3d98c1703c32a1f05a4e7e878;K8S_POD_UID=c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j] networking: Multus: [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-54bb9484b9-l9k8j?timeout=1m0s": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:00 crc kubenswrapper[4837]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 11:53:00 crc kubenswrapper[4837]: > pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.509106 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"route-controller-manager-54bb9484b9-l9k8j_openshift-route-controller-manager(c6c2dd46-4cc0-4802-b96e-7d395d3dbc50)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"route-controller-manager-54bb9484b9-l9k8j_openshift-route-controller-manager(c6c2dd46-4cc0-4802-b96e-7d395d3dbc50)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-54bb9484b9-l9k8j_openshift-route-controller-manager_c6c2dd46-4cc0-4802-b96e-7d395d3dbc50_0(0e521213d7075b7205e2d99e482f1c55c0f091f3d98c1703c32a1f05a4e7e878): error adding pod openshift-route-controller-manager_route-controller-manager-54bb9484b9-l9k8j to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"0e521213d7075b7205e2d99e482f1c55c0f091f3d98c1703c32a1f05a4e7e878\\\" Netns:\\\"/var/run/netns/d70e04b4-380d-4dc7-a3e7-5f2471ba8c3a\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-54bb9484b9-l9k8j;K8S_POD_INFRA_CONTAINER_ID=0e521213d7075b7205e2d99e482f1c55c0f091f3d98c1703c32a1f05a4e7e878;K8S_POD_UID=c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\\\" Path:\\\"\\\" ERRORED: error configuring pod 
[openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j] networking: Multus: [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-54bb9484b9-l9k8j?timeout=1m0s\\\": dial tcp 38.102.83.138:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" podUID="c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.631761 4837 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.632460 4837 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 
11:53:00.633099 4837 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.633529 4837 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.633849 4837 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.633917 4837 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.634379 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="200ms" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.644676 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:53:00Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:53:00Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:53:00Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:53:00Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:1295a1f0e74ae87f51a733e28b64c6fdb6b9a5b069a6897b3870fe52cc1c3b0b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:505eeaa3f051e9f4ea6a622aca92e5c4eae07078ca185d9fecfe8cc9b6dfc899\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1739173859},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:4855408bd0e4d0711383d0c14dcad53c98255ff9f83f6cbefb57e47eacc1f1f1\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:97bdbb5854e4ad7976209a44cff02c8a2b9542f58ad007c06a5c3a5e8266def1\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1284762325},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"
sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:27f5385c5b700fb400a618b51a628f0db39afa4a8db03380252ca5abf49518da\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:3d8cd257adb4bde31657aa6b0fe5da54d74b1f9eda5457c8dee929ed64ecece0\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1221692102},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d
8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":
[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.645480 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.645963 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.646205 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.646545 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.646579 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 
11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.678175 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.679863 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.680833 4837 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea" exitCode=0 Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.680866 4837 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17" exitCode=0 Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.680889 4837 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2" exitCode=0 Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.680898 4837 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208" exitCode=2 Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.680947 4837 scope.go:117] "RemoveContainer" containerID="6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8" Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.685695 4837 generic.go:334] "Generic (PLEG): container finished" podID="def1c7aa-51a6-4ee0-93d5-714721e9fc27" containerID="65e28d9ae9393725ced85e7d2690513d16c23a3765c0987cd0c005a1bb7bef87" exitCode=0 Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.685781 4837 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"def1c7aa-51a6-4ee0-93d5-714721e9fc27","Type":"ContainerDied","Data":"65e28d9ae9393725ced85e7d2690513d16c23a3765c0987cd0c005a1bb7bef87"} Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.686298 4837 status_manager.go:851] "Failed to get status for pod" podUID="def1c7aa-51a6-4ee0-93d5-714721e9fc27" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.686612 4837 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.687365 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.687804 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.835558 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="400ms" Mar 13 11:53:01 crc kubenswrapper[4837]: I0313 11:53:01.055306 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b640fc3-2425-48b0-adfa-3300a6d52002" path="/var/lib/kubelet/pods/5b640fc3-2425-48b0-adfa-3300a6d52002/volumes" Mar 13 11:53:01 crc kubenswrapper[4837]: I0313 11:53:01.056685 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6fdbbb9-292f-4621-892a-53a6c1c13f65" path="/var/lib/kubelet/pods/e6fdbbb9-292f-4621-892a-53a6c1c13f65/volumes" Mar 13 11:53:01 crc kubenswrapper[4837]: E0313 11:53:01.237321 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="800ms" Mar 13 11:53:01 crc kubenswrapper[4837]: E0313 11:53:01.238072 4837 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 13 11:53:01 crc kubenswrapper[4837]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-54bb9484b9-l9k8j_openshift-route-controller-manager_c6c2dd46-4cc0-4802-b96e-7d395d3dbc50_0(a8c46c0a779ef94bfb027c26c4a1554b6e9d988b0405909a29e7717aeba7d586): error adding pod openshift-route-controller-manager_route-controller-manager-54bb9484b9-l9k8j to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 
'ContainerID:"a8c46c0a779ef94bfb027c26c4a1554b6e9d988b0405909a29e7717aeba7d586" Netns:"/var/run/netns/e4fa0d47-327f-469f-9fcd-afaac8c1cd71" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-54bb9484b9-l9k8j;K8S_POD_INFRA_CONTAINER_ID=a8c46c0a779ef94bfb027c26c4a1554b6e9d988b0405909a29e7717aeba7d586;K8S_POD_UID=c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j] networking: Multus: [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-54bb9484b9-l9k8j?timeout=1m0s": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:01 crc kubenswrapper[4837]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 11:53:01 crc kubenswrapper[4837]: > Mar 13 11:53:01 crc kubenswrapper[4837]: E0313 11:53:01.238149 4837 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 13 11:53:01 crc kubenswrapper[4837]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_route-controller-manager-54bb9484b9-l9k8j_openshift-route-controller-manager_c6c2dd46-4cc0-4802-b96e-7d395d3dbc50_0(a8c46c0a779ef94bfb027c26c4a1554b6e9d988b0405909a29e7717aeba7d586): error adding pod openshift-route-controller-manager_route-controller-manager-54bb9484b9-l9k8j to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a8c46c0a779ef94bfb027c26c4a1554b6e9d988b0405909a29e7717aeba7d586" Netns:"/var/run/netns/e4fa0d47-327f-469f-9fcd-afaac8c1cd71" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-54bb9484b9-l9k8j;K8S_POD_INFRA_CONTAINER_ID=a8c46c0a779ef94bfb027c26c4a1554b6e9d988b0405909a29e7717aeba7d586;K8S_POD_UID=c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j] networking: Multus: [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-54bb9484b9-l9k8j?timeout=1m0s": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:01 crc kubenswrapper[4837]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 11:53:01 crc kubenswrapper[4837]: > pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:53:01 crc kubenswrapper[4837]: E0313 11:53:01.238173 4837 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 13 11:53:01 crc kubenswrapper[4837]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-54bb9484b9-l9k8j_openshift-route-controller-manager_c6c2dd46-4cc0-4802-b96e-7d395d3dbc50_0(a8c46c0a779ef94bfb027c26c4a1554b6e9d988b0405909a29e7717aeba7d586): error adding pod openshift-route-controller-manager_route-controller-manager-54bb9484b9-l9k8j to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a8c46c0a779ef94bfb027c26c4a1554b6e9d988b0405909a29e7717aeba7d586" Netns:"/var/run/netns/e4fa0d47-327f-469f-9fcd-afaac8c1cd71" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-54bb9484b9-l9k8j;K8S_POD_INFRA_CONTAINER_ID=a8c46c0a779ef94bfb027c26c4a1554b6e9d988b0405909a29e7717aeba7d586;K8S_POD_UID=c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j] networking: Multus: [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of 
cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-54bb9484b9-l9k8j?timeout=1m0s": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:01 crc kubenswrapper[4837]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 11:53:01 crc kubenswrapper[4837]: > pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:53:01 crc kubenswrapper[4837]: E0313 11:53:01.238244 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"route-controller-manager-54bb9484b9-l9k8j_openshift-route-controller-manager(c6c2dd46-4cc0-4802-b96e-7d395d3dbc50)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"route-controller-manager-54bb9484b9-l9k8j_openshift-route-controller-manager(c6c2dd46-4cc0-4802-b96e-7d395d3dbc50)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-54bb9484b9-l9k8j_openshift-route-controller-manager_c6c2dd46-4cc0-4802-b96e-7d395d3dbc50_0(a8c46c0a779ef94bfb027c26c4a1554b6e9d988b0405909a29e7717aeba7d586): error adding pod openshift-route-controller-manager_route-controller-manager-54bb9484b9-l9k8j to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"a8c46c0a779ef94bfb027c26c4a1554b6e9d988b0405909a29e7717aeba7d586\\\" 
Netns:\\\"/var/run/netns/e4fa0d47-327f-469f-9fcd-afaac8c1cd71\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-54bb9484b9-l9k8j;K8S_POD_INFRA_CONTAINER_ID=a8c46c0a779ef94bfb027c26c4a1554b6e9d988b0405909a29e7717aeba7d586;K8S_POD_UID=c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j] networking: Multus: [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-54bb9484b9-l9k8j?timeout=1m0s\\\": dial tcp 38.102.83.138:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" podUID="c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" Mar 13 11:53:01 crc kubenswrapper[4837]: I0313 11:53:01.335465 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8xbg\" (UniqueName: 
\"kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:53:01 crc kubenswrapper[4837]: E0313 11:53:01.336163 4837 projected.go:194] Error preparing data for projected volume kube-api-access-r8xbg for pod openshift-controller-manager/controller-manager-68c5756767-4nmg2: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:01 crc kubenswrapper[4837]: E0313 11:53:01.336253 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg podName:557e8146-afbb-41a0-a477-c69f4575656c nodeName:}" failed. No retries permitted until 2026-03-13 11:53:03.33623303 +0000 UTC m=+298.974499803 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-r8xbg" (UniqueName: "kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg") pod "controller-manager-68c5756767-4nmg2" (UID: "557e8146-afbb-41a0-a477-c69f4575656c") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:01 crc kubenswrapper[4837]: I0313 11:53:01.694152 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 11:53:01 crc kubenswrapper[4837]: I0313 11:53:01.983065 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:53:01 crc kubenswrapper[4837]: I0313 11:53:01.984297 4837 status_manager.go:851] "Failed to get status for pod" podUID="def1c7aa-51a6-4ee0-93d5-714721e9fc27" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:02 crc kubenswrapper[4837]: E0313 11:53:02.038273 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="1.6s" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.042658 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/def1c7aa-51a6-4ee0-93d5-714721e9fc27-kubelet-dir\") pod \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\" (UID: \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\") " Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.042752 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/def1c7aa-51a6-4ee0-93d5-714721e9fc27-var-lock\") pod \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\" (UID: \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\") " Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.042775 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/def1c7aa-51a6-4ee0-93d5-714721e9fc27-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "def1c7aa-51a6-4ee0-93d5-714721e9fc27" (UID: "def1c7aa-51a6-4ee0-93d5-714721e9fc27"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.042798 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/def1c7aa-51a6-4ee0-93d5-714721e9fc27-kube-api-access\") pod \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\" (UID: \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\") " Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.042813 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/def1c7aa-51a6-4ee0-93d5-714721e9fc27-var-lock" (OuterVolumeSpecName: "var-lock") pod "def1c7aa-51a6-4ee0-93d5-714721e9fc27" (UID: "def1c7aa-51a6-4ee0-93d5-714721e9fc27"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.043124 4837 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/def1c7aa-51a6-4ee0-93d5-714721e9fc27-var-lock\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.043144 4837 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/def1c7aa-51a6-4ee0-93d5-714721e9fc27-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.048392 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/def1c7aa-51a6-4ee0-93d5-714721e9fc27-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "def1c7aa-51a6-4ee0-93d5-714721e9fc27" (UID: "def1c7aa-51a6-4ee0-93d5-714721e9fc27"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.118513 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.119338 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.119991 4837 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.120452 4837 status_manager.go:851] "Failed to get status for pod" podUID="def1c7aa-51a6-4ee0-93d5-714721e9fc27" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.144269 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/def1c7aa-51a6-4ee0-93d5-714721e9fc27-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.245029 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.245108 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.245198 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.245214 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.245288 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.245401 4837 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.245416 4837 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.245461 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.346427 4837 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.710775 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.711697 4837 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab" exitCode=0 Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.711813 4837 scope.go:117] "RemoveContainer" containerID="abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.711826 4837 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.715056 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"def1c7aa-51a6-4ee0-93d5-714721e9fc27","Type":"ContainerDied","Data":"971c2474cd87ef98e0d1f40d16055d082ae34c7f72c4c883a6feb86fd0ff4ce0"} Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.715113 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="971c2474cd87ef98e0d1f40d16055d082ae34c7f72c4c883a6feb86fd0ff4ce0" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.715370 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.729179 4837 scope.go:117] "RemoveContainer" containerID="682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.740158 4837 status_manager.go:851] "Failed to get status for pod" podUID="def1c7aa-51a6-4ee0-93d5-714721e9fc27" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.740803 4837 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.744538 4837 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.745293 4837 status_manager.go:851] "Failed to get status for pod" podUID="def1c7aa-51a6-4ee0-93d5-714721e9fc27" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.751940 4837 scope.go:117] "RemoveContainer" containerID="9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.764740 4837 scope.go:117] "RemoveContainer" containerID="804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.776783 4837 scope.go:117] "RemoveContainer" containerID="f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.795110 4837 scope.go:117] "RemoveContainer" containerID="6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.812600 4837 scope.go:117] "RemoveContainer" containerID="abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea" Mar 13 11:53:02 crc kubenswrapper[4837]: E0313 11:53:02.813151 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\": container with ID starting with abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea not found: ID does not exist" containerID="abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.813200 4837 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea"} err="failed to get container status \"abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\": rpc error: code = NotFound desc = could not find container \"abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\": container with ID starting with abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea not found: ID does not exist" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.813236 4837 scope.go:117] "RemoveContainer" containerID="682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17" Mar 13 11:53:02 crc kubenswrapper[4837]: E0313 11:53:02.814087 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\": container with ID starting with 682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17 not found: ID does not exist" containerID="682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.814168 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17"} err="failed to get container status \"682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\": rpc error: code = NotFound desc = could not find container \"682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\": container with ID starting with 682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17 not found: ID does not exist" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.814213 4837 scope.go:117] "RemoveContainer" containerID="9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2" Mar 13 11:53:02 crc kubenswrapper[4837]: E0313 
11:53:02.814847 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\": container with ID starting with 9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2 not found: ID does not exist" containerID="9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.814879 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2"} err="failed to get container status \"9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\": rpc error: code = NotFound desc = could not find container \"9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\": container with ID starting with 9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2 not found: ID does not exist" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.814931 4837 scope.go:117] "RemoveContainer" containerID="804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208" Mar 13 11:53:02 crc kubenswrapper[4837]: E0313 11:53:02.815433 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\": container with ID starting with 804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208 not found: ID does not exist" containerID="804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.815491 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208"} err="failed to get container status \"804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\": rpc 
error: code = NotFound desc = could not find container \"804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\": container with ID starting with 804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208 not found: ID does not exist" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.815516 4837 scope.go:117] "RemoveContainer" containerID="f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab" Mar 13 11:53:02 crc kubenswrapper[4837]: E0313 11:53:02.815789 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\": container with ID starting with f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab not found: ID does not exist" containerID="f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.815821 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab"} err="failed to get container status \"f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\": rpc error: code = NotFound desc = could not find container \"f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\": container with ID starting with f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab not found: ID does not exist" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.815838 4837 scope.go:117] "RemoveContainer" containerID="6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75" Mar 13 11:53:02 crc kubenswrapper[4837]: E0313 11:53:02.816265 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\": container with ID starting with 
6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75 not found: ID does not exist" containerID="6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.816293 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75"} err="failed to get container status \"6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\": rpc error: code = NotFound desc = could not find container \"6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\": container with ID starting with 6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75 not found: ID does not exist" Mar 13 11:53:03 crc kubenswrapper[4837]: I0313 11:53:03.056456 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 13 11:53:03 crc kubenswrapper[4837]: I0313 11:53:03.362134 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8xbg\" (UniqueName: \"kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:53:03 crc kubenswrapper[4837]: E0313 11:53:03.362767 4837 projected.go:194] Error preparing data for projected volume kube-api-access-r8xbg for pod openshift-controller-manager/controller-manager-68c5756767-4nmg2: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:03 crc kubenswrapper[4837]: E0313 11:53:03.362850 4837 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg podName:557e8146-afbb-41a0-a477-c69f4575656c nodeName:}" failed. No retries permitted until 2026-03-13 11:53:07.362827541 +0000 UTC m=+303.001094304 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-r8xbg" (UniqueName: "kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg") pod "controller-manager-68c5756767-4nmg2" (UID: "557e8146-afbb-41a0-a477-c69f4575656c") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:03 crc kubenswrapper[4837]: E0313 11:53:03.639483 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="3.2s" Mar 13 11:53:04 crc kubenswrapper[4837]: E0313 11:53:04.786615 4837 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:53:04 crc kubenswrapper[4837]: I0313 11:53:04.793294 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:53:04 crc kubenswrapper[4837]: W0313 11:53:04.821021 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-472f6fd64f8b2b79919c1f52a7a66b4196ddee94f4c73bc4bae77a352b472876 WatchSource:0}: Error finding container 472f6fd64f8b2b79919c1f52a7a66b4196ddee94f4c73bc4bae77a352b472876: Status 404 returned error can't find the container with id 472f6fd64f8b2b79919c1f52a7a66b4196ddee94f4c73bc4bae77a352b472876 Mar 13 11:53:05 crc kubenswrapper[4837]: I0313 11:53:05.050864 4837 status_manager.go:851] "Failed to get status for pod" podUID="def1c7aa-51a6-4ee0-93d5-714721e9fc27" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:05 crc kubenswrapper[4837]: I0313 11:53:05.733369 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f462647c368a2c9ec95ac55fb1cc58505023aef3779f0d3c720f9d8861e5c80e"} Mar 13 11:53:05 crc kubenswrapper[4837]: I0313 11:53:05.733660 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"472f6fd64f8b2b79919c1f52a7a66b4196ddee94f4c73bc4bae77a352b472876"} Mar 13 11:53:05 crc kubenswrapper[4837]: E0313 11:53:05.734259 4837 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.138:6443: connect: connection refused" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:53:05 crc kubenswrapper[4837]: I0313 11:53:05.734359 4837 status_manager.go:851] "Failed to get status for pod" podUID="def1c7aa-51a6-4ee0-93d5-714721e9fc27" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:06 crc kubenswrapper[4837]: E0313 11:53:06.115874 4837 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.138:6443: connect: connection refused" event="&Event{ObjectMeta:{controller-manager-68c5756767-4nmg2.189c646eaeb065ea openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-68c5756767-4nmg2,UID:557e8146-afbb-41a0-a477-c69f4575656c,APIVersion:v1,ResourceVersion:29927,FieldPath:,},Reason:FailedMount,Message:MountVolume.SetUp failed for volume \"kube-api-access-r8xbg\" : failed to fetch token: Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token\": dial tcp 38.102.83.138:6443: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:52:59.81710897 +0000 UTC m=+295.455375743,LastTimestamp:2026-03-13 11:52:59.81710897 +0000 UTC m=+295.455375743,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:53:06 crc kubenswrapper[4837]: E0313 11:53:06.840517 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="6.4s" Mar 13 11:53:07 crc kubenswrapper[4837]: I0313 11:53:07.419012 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8xbg\" (UniqueName: \"kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:53:07 crc kubenswrapper[4837]: E0313 11:53:07.420096 4837 projected.go:194] Error preparing data for projected volume kube-api-access-r8xbg for pod openshift-controller-manager/controller-manager-68c5756767-4nmg2: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:07 crc kubenswrapper[4837]: E0313 11:53:07.420173 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg podName:557e8146-afbb-41a0-a477-c69f4575656c nodeName:}" failed. No retries permitted until 2026-03-13 11:53:15.420152635 +0000 UTC m=+311.058419398 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-r8xbg" (UniqueName: "kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg") pod "controller-manager-68c5756767-4nmg2" (UID: "557e8146-afbb-41a0-a477-c69f4575656c") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:10 crc kubenswrapper[4837]: E0313 11:53:10.880409 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:53:10Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:53:10Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:53:10Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:53:10Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:1295a1f0e74ae87f51a733e28b64c6fdb6b9a5b069a6897b3870fe52cc1c3b0b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:505eeaa3f051e9f4ea6a622aca92e5c4eae07078ca185d9fecfe8cc9b6dfc899\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1739173859},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:4855408bd0e4d0711383d0c14dcad53c98255ff9f83f6cbefb57e47eacc1f1f1\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:97bdbb5854e4ad7976209a44cff02c8a2b9542f58ad007c06a5c3a5e8266def1\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1284762325},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:27f5385c5b700fb400a618b51a628f0db39afa4a8db03380252ca5abf49518da\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:3d8cd257adb4bde31657aa6b0fe5da54d74b1f9eda5457c8dee929ed64ecece0\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1221692102},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes
\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744
b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-cs
i-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:10 crc kubenswrapper[4837]: E0313 11:53:10.881503 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:10 crc kubenswrapper[4837]: E0313 11:53:10.882141 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:10 crc kubenswrapper[4837]: E0313 11:53:10.882448 4837 kubelet_node_status.go:585] "Error updating node status, will retry" 
err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:10 crc kubenswrapper[4837]: E0313 11:53:10.882776 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:10 crc kubenswrapper[4837]: E0313 11:53:10.882807 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:53:11 crc kubenswrapper[4837]: I0313 11:53:11.047763 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:53:11 crc kubenswrapper[4837]: I0313 11:53:11.049414 4837 status_manager.go:851] "Failed to get status for pod" podUID="def1c7aa-51a6-4ee0-93d5-714721e9fc27" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:11 crc kubenswrapper[4837]: I0313 11:53:11.073576 4837 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93dcd114-c39a-4b27-aa9c-a42e3ef7cd79" Mar 13 11:53:11 crc kubenswrapper[4837]: I0313 11:53:11.073618 4837 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93dcd114-c39a-4b27-aa9c-a42e3ef7cd79" Mar 13 11:53:11 crc kubenswrapper[4837]: E0313 11:53:11.074218 4837 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:53:11 crc 
kubenswrapper[4837]: I0313 11:53:11.074802 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:53:11 crc kubenswrapper[4837]: I0313 11:53:11.815624 4837 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="45e7881172f4f96a2ab523ddff9d11417fc1f83ee6944eb7846886e80e9ec03b" exitCode=0 Mar 13 11:53:11 crc kubenswrapper[4837]: I0313 11:53:11.815742 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"45e7881172f4f96a2ab523ddff9d11417fc1f83ee6944eb7846886e80e9ec03b"} Mar 13 11:53:11 crc kubenswrapper[4837]: I0313 11:53:11.816335 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c253ac6482cc3e65aaeedf2ec09af79a4403516e10e60eabd19cc7da59376637"} Mar 13 11:53:11 crc kubenswrapper[4837]: I0313 11:53:11.816846 4837 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93dcd114-c39a-4b27-aa9c-a42e3ef7cd79" Mar 13 11:53:11 crc kubenswrapper[4837]: I0313 11:53:11.816885 4837 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93dcd114-c39a-4b27-aa9c-a42e3ef7cd79" Mar 13 11:53:11 crc kubenswrapper[4837]: I0313 11:53:11.817326 4837 status_manager.go:851] "Failed to get status for pod" podUID="def1c7aa-51a6-4ee0-93d5-714721e9fc27" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:11 crc kubenswrapper[4837]: E0313 11:53:11.817446 4837 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:53:12 crc kubenswrapper[4837]: I0313 11:53:12.825329 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"466a4dd4a2a6112b71c193c861b87265fa79034d9da85058dae83b5b0ed36623"} Mar 13 11:53:12 crc kubenswrapper[4837]: I0313 11:53:12.825730 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cb18eaac47947fa66afcc1f892beb971a140cbd7756952a8edf0fd75614e023a"} Mar 13 11:53:12 crc kubenswrapper[4837]: I0313 11:53:12.825746 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"91f9e508710d7c4605dd48a03bff40e6ad2f5986c090a33921c24d6137f14829"} Mar 13 11:53:12 crc kubenswrapper[4837]: I0313 11:53:12.825758 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9aec12bda443dab2c3646cea5c65c999fbc3a967fd5e31db688ed90bdf968f4f"} Mar 13 11:53:13 crc kubenswrapper[4837]: I0313 11:53:13.833322 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1e519516e8e77325ae252ee0ce1635557d69f2d3c4b19cb91e4d081292c04e70"} Mar 13 11:53:13 crc kubenswrapper[4837]: I0313 11:53:13.833685 4837 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93dcd114-c39a-4b27-aa9c-a42e3ef7cd79" 
Mar 13 11:53:13 crc kubenswrapper[4837]: I0313 11:53:13.833713 4837 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93dcd114-c39a-4b27-aa9c-a42e3ef7cd79" Mar 13 11:53:13 crc kubenswrapper[4837]: I0313 11:53:13.833881 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:53:14 crc kubenswrapper[4837]: I0313 11:53:14.047914 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:53:14 crc kubenswrapper[4837]: I0313 11:53:14.048484 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:53:14 crc kubenswrapper[4837]: I0313 11:53:14.841372 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 13 11:53:14 crc kubenswrapper[4837]: I0313 11:53:14.842615 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 13 11:53:14 crc kubenswrapper[4837]: I0313 11:53:14.842669 4837 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="29dcf2d4dbca31492c07df5fcf50217d44ab7914e536e5ae6d8187e8b2b3e62f" exitCode=1 Mar 13 11:53:14 crc kubenswrapper[4837]: I0313 11:53:14.842704 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"29dcf2d4dbca31492c07df5fcf50217d44ab7914e536e5ae6d8187e8b2b3e62f"} Mar 13 11:53:14 crc kubenswrapper[4837]: I0313 11:53:14.843154 4837 
scope.go:117] "RemoveContainer" containerID="29dcf2d4dbca31492c07df5fcf50217d44ab7914e536e5ae6d8187e8b2b3e62f" Mar 13 11:53:15 crc kubenswrapper[4837]: I0313 11:53:15.456083 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8xbg\" (UniqueName: \"kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:53:15 crc kubenswrapper[4837]: I0313 11:53:15.479675 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8xbg\" (UniqueName: \"kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:53:15 crc kubenswrapper[4837]: I0313 11:53:15.536285 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:53:15 crc kubenswrapper[4837]: I0313 11:53:15.853706 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 13 11:53:15 crc kubenswrapper[4837]: I0313 11:53:15.855042 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 13 11:53:15 crc kubenswrapper[4837]: I0313 11:53:15.855102 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"82a2cfe3baf1dac4243d4c8eb1e4cc4e0aabd08de67ece4654f31efea4f2dadf"} Mar 13 11:53:15 crc kubenswrapper[4837]: W0313 11:53:15.932525 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod557e8146_afbb_41a0_a477_c69f4575656c.slice/crio-c261a0e9e80a95ebb82e7949fc3a9fbd1359cabd2a415e857d84f9967ed8eb5d WatchSource:0}: Error finding container c261a0e9e80a95ebb82e7949fc3a9fbd1359cabd2a415e857d84f9967ed8eb5d: Status 404 returned error can't find the container with id c261a0e9e80a95ebb82e7949fc3a9fbd1359cabd2a415e857d84f9967ed8eb5d Mar 13 11:53:16 crc kubenswrapper[4837]: I0313 11:53:16.075231 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:53:16 crc kubenswrapper[4837]: I0313 11:53:16.075819 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:53:16 crc kubenswrapper[4837]: I0313 11:53:16.081213 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:53:16 crc kubenswrapper[4837]: I0313 11:53:16.860884 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" event={"ID":"557e8146-afbb-41a0-a477-c69f4575656c","Type":"ContainerStarted","Data":"bd0597c1cc32fbd78f39ed829a4641d444991994c57793796768416306af6501"} Mar 13 11:53:16 crc kubenswrapper[4837]: I0313 11:53:16.861233 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:53:16 crc kubenswrapper[4837]: I0313 11:53:16.861247 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" event={"ID":"557e8146-afbb-41a0-a477-c69f4575656c","Type":"ContainerStarted","Data":"c261a0e9e80a95ebb82e7949fc3a9fbd1359cabd2a415e857d84f9967ed8eb5d"} Mar 13 11:53:16 crc kubenswrapper[4837]: I0313 11:53:16.866153 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:53:18 crc kubenswrapper[4837]: I0313 11:53:18.842179 4837 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:53:18 crc kubenswrapper[4837]: I0313 11:53:18.871010 4837 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93dcd114-c39a-4b27-aa9c-a42e3ef7cd79" Mar 13 11:53:18 crc kubenswrapper[4837]: I0313 11:53:18.871046 4837 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93dcd114-c39a-4b27-aa9c-a42e3ef7cd79" Mar 13 11:53:18 crc kubenswrapper[4837]: I0313 11:53:18.877553 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:53:18 crc kubenswrapper[4837]: I0313 11:53:18.902310 4837 
status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="5fce7cde-9b5b-40cc-8fdb-92fce8257be0" Mar 13 11:53:19 crc kubenswrapper[4837]: I0313 11:53:19.878091 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" event={"ID":"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50","Type":"ContainerStarted","Data":"37f72418a416002c47a4ca07b43700f505d4349cb995ba856c6770efb8dff147"} Mar 13 11:53:19 crc kubenswrapper[4837]: I0313 11:53:19.878439 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" event={"ID":"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50","Type":"ContainerStarted","Data":"babcfb3743f649875ff8952ae21dfa6891e88a59ef60ef37e4e768f1a82566fb"} Mar 13 11:53:19 crc kubenswrapper[4837]: I0313 11:53:19.878398 4837 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93dcd114-c39a-4b27-aa9c-a42e3ef7cd79" Mar 13 11:53:19 crc kubenswrapper[4837]: I0313 11:53:19.878551 4837 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93dcd114-c39a-4b27-aa9c-a42e3ef7cd79" Mar 13 11:53:19 crc kubenswrapper[4837]: I0313 11:53:19.878695 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:53:19 crc kubenswrapper[4837]: I0313 11:53:19.896270 4837 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="5fce7cde-9b5b-40cc-8fdb-92fce8257be0" Mar 13 11:53:19 crc kubenswrapper[4837]: I0313 11:53:19.944558 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:53:20 crc kubenswrapper[4837]: I0313 11:53:20.878709 4837 patch_prober.go:28] interesting pod/route-controller-manager-54bb9484b9-l9k8j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:53:20 crc kubenswrapper[4837]: I0313 11:53:20.878793 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" podUID="c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 11:53:21 crc kubenswrapper[4837]: I0313 11:53:21.879345 4837 patch_prober.go:28] interesting pod/route-controller-manager-54bb9484b9-l9k8j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:53:21 crc kubenswrapper[4837]: I0313 11:53:21.879419 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" podUID="c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 11:53:22 crc kubenswrapper[4837]: I0313 11:53:22.880067 4837 patch_prober.go:28] interesting 
pod/route-controller-manager-54bb9484b9-l9k8j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:53:22 crc kubenswrapper[4837]: I0313 11:53:22.880161 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" podUID="c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 11:53:23 crc kubenswrapper[4837]: I0313 11:53:23.760418 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:53:23 crc kubenswrapper[4837]: I0313 11:53:23.767830 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:53:25 crc kubenswrapper[4837]: I0313 11:53:25.897193 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 13 11:53:27 crc kubenswrapper[4837]: I0313 11:53:27.228972 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 13 11:53:27 crc kubenswrapper[4837]: I0313 11:53:27.470893 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 13 11:53:29 crc kubenswrapper[4837]: I0313 11:53:29.003316 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 11:53:29 crc kubenswrapper[4837]: 
I0313 11:53:29.130910 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 13 11:53:29 crc kubenswrapper[4837]: I0313 11:53:29.380789 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 13 11:53:29 crc kubenswrapper[4837]: I0313 11:53:29.685850 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 13 11:53:29 crc kubenswrapper[4837]: I0313 11:53:29.822758 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 13 11:53:29 crc kubenswrapper[4837]: I0313 11:53:29.951687 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:53:30 crc kubenswrapper[4837]: I0313 11:53:30.232890 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 13 11:53:30 crc kubenswrapper[4837]: I0313 11:53:30.333516 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 11:53:30 crc kubenswrapper[4837]: I0313 11:53:30.601759 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 13 11:53:30 crc kubenswrapper[4837]: I0313 11:53:30.727303 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 11:53:30 crc kubenswrapper[4837]: I0313 11:53:30.819419 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 13 11:53:30 crc kubenswrapper[4837]: I0313 11:53:30.928290 4837 patch_prober.go:28] interesting pod/route-controller-manager-54bb9484b9-l9k8j container/route-controller-manager 
namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:53:30 crc kubenswrapper[4837]: I0313 11:53:30.928371 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" podUID="c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 11:53:30 crc kubenswrapper[4837]: I0313 11:53:30.976076 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 13 11:53:31 crc kubenswrapper[4837]: I0313 11:53:31.054842 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 13 11:53:31 crc kubenswrapper[4837]: I0313 11:53:31.247733 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 13 11:53:31 crc kubenswrapper[4837]: I0313 11:53:31.355315 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 11:53:31 crc kubenswrapper[4837]: I0313 11:53:31.464951 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 13 11:53:31 crc kubenswrapper[4837]: I0313 11:53:31.535875 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 13 11:53:31 crc kubenswrapper[4837]: I0313 11:53:31.560808 4837 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 13 11:53:31 crc kubenswrapper[4837]: I0313 11:53:31.934372 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 13 11:53:32 crc kubenswrapper[4837]: I0313 11:53:32.051151 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 13 11:53:32 crc kubenswrapper[4837]: I0313 11:53:32.133229 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 13 11:53:32 crc kubenswrapper[4837]: I0313 11:53:32.304254 4837 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 13 11:53:32 crc kubenswrapper[4837]: I0313 11:53:32.327846 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 11:53:32 crc kubenswrapper[4837]: I0313 11:53:32.365434 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 13 11:53:32 crc kubenswrapper[4837]: I0313 11:53:32.402305 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 13 11:53:32 crc kubenswrapper[4837]: I0313 11:53:32.504396 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 13 11:53:32 crc kubenswrapper[4837]: I0313 11:53:32.575520 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 13 11:53:32 crc kubenswrapper[4837]: I0313 11:53:32.595822 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 13 11:53:32 crc kubenswrapper[4837]: I0313 
11:53:32.636365 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 13 11:53:32 crc kubenswrapper[4837]: I0313 11:53:32.884488 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.180102 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.196620 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.326333 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.467063 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.469199 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.541368 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.641138 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.720461 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.749474 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.832784 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.871853 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.898173 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.898408 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.906015 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.927050 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.982412 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.018887 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.018999 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.075060 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.140176 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.149010 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.225183 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.279452 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.281280 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.298738 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.305372 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.380159 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.463174 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.492729 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.719686 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.720168 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.724885 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.806667 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.102691 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.117844 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.173227 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.283070 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.388772 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.460865 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.487660 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.494276 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.495364 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.595527 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.629218 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.726952 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.931791 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.936378 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.936576 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.974414 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.000839 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.061731 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.081199 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.146022 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.237500 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.237510 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.290767 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.319484 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.399745 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.431095 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.499429 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.509574 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.567265 4837 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.570900 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.578126 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.614172 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.663789 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.692781 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.767457 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.808857 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.019979 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.081146 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.090756 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.120409 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.169261 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.224085 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.239077 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.258254 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.364973 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.395342 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.446916 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.523149 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.550014 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.631706 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.825178 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.826068 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.883617 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.928121 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.950565 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.955433 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.054283 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.061684 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.147138 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.171601 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.199263 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.227809 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.238591 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.250282 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.293759 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.306242 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.414171 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.453921 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.516533 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.519004 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.617378 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.744232 4837 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.796704 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.873250 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.875379 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.927286 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.958791 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.070475 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.107352 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.124541 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.158717 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.224710 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.260081 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.295327 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.299204 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.472019 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.523663 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.579282 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.675785 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.730812 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.736487 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.747273 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.778876 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.860138 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.880531 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.889192 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.971592 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.977732 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.025581 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.035255 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.118719 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.170958 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.177501 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.263971 4837 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.400695 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.468802 4837 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.486210 4837 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.490394 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" podStartSLOduration=42.490363247 podStartE2EDuration="42.490363247s" podCreationTimestamp="2026-03-13 11:52:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:53:18.87269223 +0000 UTC m=+314.510958993" watchObservedRunningTime="2026-03-13 11:53:40.490363247 +0000 UTC m=+336.128630050"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.497725 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" podStartSLOduration=42.497691755 podStartE2EDuration="42.497691755s" podCreationTimestamp="2026-03-13 11:52:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:53:19.894100288 +0000 UTC m=+315.532367051" watchObservedRunningTime="2026-03-13 11:53:40.497691755 +0000 UTC m=+336.135958548"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.504404 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.504482 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.504545 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-68c5756767-4nmg2","openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j"]
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.504983 4837 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93dcd114-c39a-4b27-aa9c-a42e3ef7cd79"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.505025 4837 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93dcd114-c39a-4b27-aa9c-a42e3ef7cd79"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.513395 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.526302 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=22.526285342 podStartE2EDuration="22.526285342s" podCreationTimestamp="2026-03-13 11:53:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:53:40.526076445 +0000 UTC m=+336.164343218" watchObservedRunningTime="2026-03-13 11:53:40.526285342 +0000 UTC m=+336.164552115"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.585626 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.624982 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.626969 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.654175 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.717724 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.731040 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.769595 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.812216 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.825905 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.930040 4837 patch_prober.go:28] interesting pod/route-controller-manager-54bb9484b9-l9k8j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.930309 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" podUID="c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.942787 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.947895 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.977542 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.991491 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.992911 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.098051 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.180488 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.240810 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.259086 4837 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.259369 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://f462647c368a2c9ec95ac55fb1cc58505023aef3779f0d3c720f9d8861e5c80e" gracePeriod=5
Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.379024 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.480909 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.512757 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.546579 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.643064 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.725596 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.800025 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.831659 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.848977 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.856563 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.873029 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.893478 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.930503 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.930542 4837 patch_prober.go:28] interesting pod/route-controller-manager-54bb9484b9-l9k8j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.930609 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" podUID="c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 11:53:42 crc kubenswrapper[4837]: I0313 11:53:42.038997 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 13 11:53:42 crc kubenswrapper[4837]: I0313 11:53:42.042159 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 13 11:53:42 crc kubenswrapper[4837]: I0313 11:53:42.110835 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 13 11:53:42 crc kubenswrapper[4837]: I0313 11:53:42.519322 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 13 11:53:42 crc kubenswrapper[4837]: I0313 11:53:42.564842 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 13 11:53:42 crc kubenswrapper[4837]: I0313 11:53:42.571157 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 13 11:53:42 crc kubenswrapper[4837]: I0313 11:53:42.638981 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 13 11:53:42 crc kubenswrapper[4837]: I0313 11:53:42.667514 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 13 11:53:42 crc kubenswrapper[4837]: I0313 11:53:42.707434 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 13 11:53:42 crc kubenswrapper[4837]: I0313 11:53:42.851979 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 13 11:53:42 crc kubenswrapper[4837]: I0313 11:53:42.989625 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 13 11:53:43 crc kubenswrapper[4837]: I0313 11:53:43.052908 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 13 11:53:43 crc kubenswrapper[4837]: I0313 11:53:43.065609 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 13 11:53:43 crc kubenswrapper[4837]: I0313 11:53:43.183968 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 13 11:53:43 crc kubenswrapper[4837]: I0313 11:53:43.268324 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 13 11:53:43 crc kubenswrapper[4837]: I0313 11:53:43.268571 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 13 11:53:43 crc kubenswrapper[4837]: I0313 11:53:43.554012 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 13 11:53:43 crc kubenswrapper[4837]: I0313 11:53:43.572113 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 13 11:53:43 crc kubenswrapper[4837]: I0313 11:53:43.574565 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 13 11:53:43 crc kubenswrapper[4837]: I0313 11:53:43.653417 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 13 11:53:43 crc kubenswrapper[4837]: I0313 11:53:43.826342 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 13 11:53:43 crc kubenswrapper[4837]: I0313 11:53:43.855045 4837 reflector.go:368] Caches populated for *v1.Secret from
object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 13 11:53:43 crc kubenswrapper[4837]: I0313 11:53:43.923374 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 13 11:53:44 crc kubenswrapper[4837]: I0313 11:53:44.035791 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 13 11:53:44 crc kubenswrapper[4837]: I0313 11:53:44.043561 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 13 11:53:44 crc kubenswrapper[4837]: I0313 11:53:44.066201 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 13 11:53:44 crc kubenswrapper[4837]: I0313 11:53:44.237326 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 13 11:53:44 crc kubenswrapper[4837]: I0313 11:53:44.285627 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 13 11:53:44 crc kubenswrapper[4837]: I0313 11:53:44.360279 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 13 11:53:44 crc kubenswrapper[4837]: I0313 11:53:44.581277 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 13 11:53:44 crc kubenswrapper[4837]: I0313 11:53:44.671749 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 13 11:53:44 crc kubenswrapper[4837]: I0313 11:53:44.771101 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 13 11:53:44 crc kubenswrapper[4837]: I0313 
11:53:44.811995 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 13 11:53:44 crc kubenswrapper[4837]: I0313 11:53:44.960542 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 13 11:53:45 crc kubenswrapper[4837]: I0313 11:53:45.020127 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 13 11:53:45 crc kubenswrapper[4837]: I0313 11:53:45.063349 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 13 11:53:45 crc kubenswrapper[4837]: I0313 11:53:45.259795 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 13 11:53:45 crc kubenswrapper[4837]: I0313 11:53:45.425025 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 13 11:53:45 crc kubenswrapper[4837]: I0313 11:53:45.452949 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 13 11:53:45 crc kubenswrapper[4837]: I0313 11:53:45.473733 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 13 11:53:45 crc kubenswrapper[4837]: I0313 11:53:45.489022 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 13 11:53:45 crc kubenswrapper[4837]: I0313 11:53:45.902451 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 13 11:53:45 crc kubenswrapper[4837]: I0313 11:53:45.948501 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 13 11:53:46 crc 
kubenswrapper[4837]: I0313 11:53:46.233711 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.351234 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.376678 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.391671 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.537768 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.700101 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.834209 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.834290 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.927867 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.960769 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.960818 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.960849 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.960866 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.960912 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 11:53:46 crc 
kubenswrapper[4837]: I0313 11:53:46.960897 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.960930 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.960952 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.960967 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.961171 4837 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.961189 4837 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.961200 4837 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.961212 4837 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.967905 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:53:47 crc kubenswrapper[4837]: I0313 11:53:47.008570 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 13 11:53:47 crc kubenswrapper[4837]: I0313 11:53:47.008624 4837 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="f462647c368a2c9ec95ac55fb1cc58505023aef3779f0d3c720f9d8861e5c80e" exitCode=137 Mar 13 11:53:47 crc kubenswrapper[4837]: I0313 11:53:47.008892 4837 scope.go:117] "RemoveContainer" containerID="f462647c368a2c9ec95ac55fb1cc58505023aef3779f0d3c720f9d8861e5c80e" Mar 13 11:53:47 crc kubenswrapper[4837]: I0313 11:53:47.008918 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:53:47 crc kubenswrapper[4837]: I0313 11:53:47.028322 4837 scope.go:117] "RemoveContainer" containerID="f462647c368a2c9ec95ac55fb1cc58505023aef3779f0d3c720f9d8861e5c80e" Mar 13 11:53:47 crc kubenswrapper[4837]: E0313 11:53:47.028693 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f462647c368a2c9ec95ac55fb1cc58505023aef3779f0d3c720f9d8861e5c80e\": container with ID starting with f462647c368a2c9ec95ac55fb1cc58505023aef3779f0d3c720f9d8861e5c80e not found: ID does not exist" containerID="f462647c368a2c9ec95ac55fb1cc58505023aef3779f0d3c720f9d8861e5c80e" Mar 13 11:53:47 crc kubenswrapper[4837]: I0313 11:53:47.028722 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f462647c368a2c9ec95ac55fb1cc58505023aef3779f0d3c720f9d8861e5c80e"} err="failed to get container status \"f462647c368a2c9ec95ac55fb1cc58505023aef3779f0d3c720f9d8861e5c80e\": rpc error: code = NotFound desc = could not find container 
\"f462647c368a2c9ec95ac55fb1cc58505023aef3779f0d3c720f9d8861e5c80e\": container with ID starting with f462647c368a2c9ec95ac55fb1cc58505023aef3779f0d3c720f9d8861e5c80e not found: ID does not exist" Mar 13 11:53:47 crc kubenswrapper[4837]: I0313 11:53:47.055786 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 13 11:53:47 crc kubenswrapper[4837]: I0313 11:53:47.062069 4837 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:49 crc kubenswrapper[4837]: I0313 11:53:49.933533 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.131434 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ft6cr"] Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.133289 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ft6cr" podUID="e6060cf2-077e-4112-af57-f100e297f320" containerName="registry-server" containerID="cri-o://981d238a29da8dc69fd7413479e02e57c3595b2787cab7169c57d333172bede1" gracePeriod=30 Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.138203 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-twtbj"] Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.138626 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-twtbj" podUID="278c91cc-2624-42cd-a35e-287e22d22f7d" containerName="registry-server" 
containerID="cri-o://e6908d46230c52fea1c314d660f53fb74dbfd03beed19d0d7b5d526d78fc8a6c" gracePeriod=30 Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.152014 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8vgmn"] Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.152231 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" podUID="8fb85cad-ec2d-4ada-bd68-55937d96a779" containerName="marketplace-operator" containerID="cri-o://7062c61986b41d101ebecc3d1bfaa5e447d278c907a23b8b3db80e27716fe090" gracePeriod=30 Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.164443 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7crb6"] Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.164715 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7crb6" podUID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" containerName="registry-server" containerID="cri-o://caf645720e683fd04b4144b714a66fde6f0b64f2a123d5270dabac05a2a4caaa" gracePeriod=30 Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.173166 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7rzpc"] Mar 13 11:53:59 crc kubenswrapper[4837]: E0313 11:53:59.173472 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def1c7aa-51a6-4ee0-93d5-714721e9fc27" containerName="installer" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.173496 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="def1c7aa-51a6-4ee0-93d5-714721e9fc27" containerName="installer" Mar 13 11:53:59 crc kubenswrapper[4837]: E0313 11:53:59.173513 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 13 11:53:59 
crc kubenswrapper[4837]: I0313 11:53:59.173522 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.173668 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.173692 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="def1c7aa-51a6-4ee0-93d5-714721e9fc27" containerName="installer" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.175403 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.177199 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ng6kk"] Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.177421 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ng6kk" podUID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" containerName="registry-server" containerID="cri-o://1b6b0960e651037356989556f5ddff9457e82572c75941cbde7fc59810854ea0" gracePeriod=30 Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.181622 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7rzpc"] Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.232916 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b87c8f86-a346-4907-9441-048c3220646f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7rzpc\" (UID: \"b87c8f86-a346-4907-9441-048c3220646f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" Mar 13 11:53:59 crc kubenswrapper[4837]: 
I0313 11:53:59.232997 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b87c8f86-a346-4907-9441-048c3220646f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7rzpc\" (UID: \"b87c8f86-a346-4907-9441-048c3220646f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.233080 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgdm8\" (UniqueName: \"kubernetes.io/projected/b87c8f86-a346-4907-9441-048c3220646f-kube-api-access-rgdm8\") pod \"marketplace-operator-79b997595-7rzpc\" (UID: \"b87c8f86-a346-4907-9441-048c3220646f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.334560 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgdm8\" (UniqueName: \"kubernetes.io/projected/b87c8f86-a346-4907-9441-048c3220646f-kube-api-access-rgdm8\") pod \"marketplace-operator-79b997595-7rzpc\" (UID: \"b87c8f86-a346-4907-9441-048c3220646f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.334613 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b87c8f86-a346-4907-9441-048c3220646f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7rzpc\" (UID: \"b87c8f86-a346-4907-9441-048c3220646f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.334660 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b87c8f86-a346-4907-9441-048c3220646f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7rzpc\" (UID: \"b87c8f86-a346-4907-9441-048c3220646f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.340069 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b87c8f86-a346-4907-9441-048c3220646f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7rzpc\" (UID: \"b87c8f86-a346-4907-9441-048c3220646f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.348749 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b87c8f86-a346-4907-9441-048c3220646f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7rzpc\" (UID: \"b87c8f86-a346-4907-9441-048c3220646f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.374043 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgdm8\" (UniqueName: \"kubernetes.io/projected/b87c8f86-a346-4907-9441-048c3220646f-kube-api-access-rgdm8\") pod \"marketplace-operator-79b997595-7rzpc\" (UID: \"b87c8f86-a346-4907-9441-048c3220646f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.496578 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.732476 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ft6cr" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.739204 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6060cf2-077e-4112-af57-f100e297f320-utilities\") pod \"e6060cf2-077e-4112-af57-f100e297f320\" (UID: \"e6060cf2-077e-4112-af57-f100e297f320\") " Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.739281 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfvgs\" (UniqueName: \"kubernetes.io/projected/e6060cf2-077e-4112-af57-f100e297f320-kube-api-access-gfvgs\") pod \"e6060cf2-077e-4112-af57-f100e297f320\" (UID: \"e6060cf2-077e-4112-af57-f100e297f320\") " Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.739327 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6060cf2-077e-4112-af57-f100e297f320-catalog-content\") pod \"e6060cf2-077e-4112-af57-f100e297f320\" (UID: \"e6060cf2-077e-4112-af57-f100e297f320\") " Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.744742 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6060cf2-077e-4112-af57-f100e297f320-utilities" (OuterVolumeSpecName: "utilities") pod "e6060cf2-077e-4112-af57-f100e297f320" (UID: "e6060cf2-077e-4112-af57-f100e297f320"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.745774 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6060cf2-077e-4112-af57-f100e297f320-kube-api-access-gfvgs" (OuterVolumeSpecName: "kube-api-access-gfvgs") pod "e6060cf2-077e-4112-af57-f100e297f320" (UID: "e6060cf2-077e-4112-af57-f100e297f320"). InnerVolumeSpecName "kube-api-access-gfvgs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.841067 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6060cf2-077e-4112-af57-f100e297f320-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6060cf2-077e-4112-af57-f100e297f320" (UID: "e6060cf2-077e-4112-af57-f100e297f320"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.842171 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfvgs\" (UniqueName: \"kubernetes.io/projected/e6060cf2-077e-4112-af57-f100e297f320-kube-api-access-gfvgs\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.842216 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6060cf2-077e-4112-af57-f100e297f320-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.842228 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6060cf2-077e-4112-af57-f100e297f320-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.938589 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.942412 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-catalog-content\") pod \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\" (UID: \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\") " Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.942491 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpclz\" (UniqueName: \"kubernetes.io/projected/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-kube-api-access-vpclz\") pod \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\" (UID: \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\") " Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.942535 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-utilities\") pod \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\" (UID: \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\") " Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.943994 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-utilities" (OuterVolumeSpecName: "utilities") pod "bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" (UID: "bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.948773 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-kube-api-access-vpclz" (OuterVolumeSpecName: "kube-api-access-vpclz") pod "bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" (UID: "bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d"). InnerVolumeSpecName "kube-api-access-vpclz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.955750 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-twtbj" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.958479 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.997153 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.043839 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fb85cad-ec2d-4ada-bd68-55937d96a779-marketplace-trusted-ca\") pod \"8fb85cad-ec2d-4ada-bd68-55937d96a779\" (UID: \"8fb85cad-ec2d-4ada-bd68-55937d96a779\") " Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.043937 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-catalog-content\") pod \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\" (UID: \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\") " Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.043965 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j25rc\" (UniqueName: \"kubernetes.io/projected/8fb85cad-ec2d-4ada-bd68-55937d96a779-kube-api-access-j25rc\") pod \"8fb85cad-ec2d-4ada-bd68-55937d96a779\" (UID: \"8fb85cad-ec2d-4ada-bd68-55937d96a779\") " Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.057185 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/8fb85cad-ec2d-4ada-bd68-55937d96a779-marketplace-operator-metrics\") pod \"8fb85cad-ec2d-4ada-bd68-55937d96a779\" (UID: \"8fb85cad-ec2d-4ada-bd68-55937d96a779\") " Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.057235 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-utilities\") pod \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\" (UID: \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\") " Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.057267 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/278c91cc-2624-42cd-a35e-287e22d22f7d-utilities\") pod \"278c91cc-2624-42cd-a35e-287e22d22f7d\" (UID: \"278c91cc-2624-42cd-a35e-287e22d22f7d\") " Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.057300 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcgkl\" (UniqueName: \"kubernetes.io/projected/278c91cc-2624-42cd-a35e-287e22d22f7d-kube-api-access-bcgkl\") pod \"278c91cc-2624-42cd-a35e-287e22d22f7d\" (UID: \"278c91cc-2624-42cd-a35e-287e22d22f7d\") " Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.057379 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfddm\" (UniqueName: \"kubernetes.io/projected/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-kube-api-access-wfddm\") pod \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\" (UID: \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\") " Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.057458 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/278c91cc-2624-42cd-a35e-287e22d22f7d-catalog-content\") pod \"278c91cc-2624-42cd-a35e-287e22d22f7d\" (UID: \"278c91cc-2624-42cd-a35e-287e22d22f7d\") " Mar 13 11:54:00 crc 
kubenswrapper[4837]: I0313 11:54:00.044473 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fb85cad-ec2d-4ada-bd68-55937d96a779-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "8fb85cad-ec2d-4ada-bd68-55937d96a779" (UID: "8fb85cad-ec2d-4ada-bd68-55937d96a779"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.047143 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fb85cad-ec2d-4ada-bd68-55937d96a779-kube-api-access-j25rc" (OuterVolumeSpecName: "kube-api-access-j25rc") pod "8fb85cad-ec2d-4ada-bd68-55937d96a779" (UID: "8fb85cad-ec2d-4ada-bd68-55937d96a779"). InnerVolumeSpecName "kube-api-access-j25rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.058351 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-utilities" (OuterVolumeSpecName: "utilities") pod "080747b0-3d43-4ff1-b21c-b8ea9fc2f961" (UID: "080747b0-3d43-4ff1-b21c-b8ea9fc2f961"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.060547 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/278c91cc-2624-42cd-a35e-287e22d22f7d-utilities" (OuterVolumeSpecName: "utilities") pod "278c91cc-2624-42cd-a35e-287e22d22f7d" (UID: "278c91cc-2624-42cd-a35e-287e22d22f7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.060749 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/278c91cc-2624-42cd-a35e-287e22d22f7d-kube-api-access-bcgkl" (OuterVolumeSpecName: "kube-api-access-bcgkl") pod "278c91cc-2624-42cd-a35e-287e22d22f7d" (UID: "278c91cc-2624-42cd-a35e-287e22d22f7d"). InnerVolumeSpecName "kube-api-access-bcgkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.061219 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpclz\" (UniqueName: \"kubernetes.io/projected/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-kube-api-access-vpclz\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.061247 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j25rc\" (UniqueName: \"kubernetes.io/projected/8fb85cad-ec2d-4ada-bd68-55937d96a779-kube-api-access-j25rc\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.061261 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.061272 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/278c91cc-2624-42cd-a35e-287e22d22f7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.061283 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcgkl\" (UniqueName: \"kubernetes.io/projected/278c91cc-2624-42cd-a35e-287e22d22f7d-kube-api-access-bcgkl\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.061293 4837 reconciler_common.go:293] "Volume detached for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.061304 4837 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fb85cad-ec2d-4ada-bd68-55937d96a779-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.062831 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fb85cad-ec2d-4ada-bd68-55937d96a779-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "8fb85cad-ec2d-4ada-bd68-55937d96a779" (UID: "8fb85cad-ec2d-4ada-bd68-55937d96a779"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.068092 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7rzpc"] Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.076817 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-kube-api-access-wfddm" (OuterVolumeSpecName: "kube-api-access-wfddm") pod "080747b0-3d43-4ff1-b21c-b8ea9fc2f961" (UID: "080747b0-3d43-4ff1-b21c-b8ea9fc2f961"). InnerVolumeSpecName "kube-api-access-wfddm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.088109 4837 generic.go:334] "Generic (PLEG): container finished" podID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" containerID="1b6b0960e651037356989556f5ddff9457e82572c75941cbde7fc59810854ea0" exitCode=0 Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.088233 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.088294 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng6kk" event={"ID":"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d","Type":"ContainerDied","Data":"1b6b0960e651037356989556f5ddff9457e82572c75941cbde7fc59810854ea0"} Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.088486 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng6kk" event={"ID":"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d","Type":"ContainerDied","Data":"eac45e620e44e693cbb55f704b7783d81f0f024e3e2cf4051be3383dc9b6b145"} Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.088509 4837 scope.go:117] "RemoveContainer" containerID="1b6b0960e651037356989556f5ddff9457e82572c75941cbde7fc59810854ea0" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.102135 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" event={"ID":"b87c8f86-a346-4907-9441-048c3220646f","Type":"ContainerStarted","Data":"a2529193e61a49baf66b7493e41d129b6cf282b6365689bc07cffc62fec9884b"} Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.102185 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" (UID: "bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.105601 4837 generic.go:334] "Generic (PLEG): container finished" podID="8fb85cad-ec2d-4ada-bd68-55937d96a779" containerID="7062c61986b41d101ebecc3d1bfaa5e447d278c907a23b8b3db80e27716fe090" exitCode=0 Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.105676 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" event={"ID":"8fb85cad-ec2d-4ada-bd68-55937d96a779","Type":"ContainerDied","Data":"7062c61986b41d101ebecc3d1bfaa5e447d278c907a23b8b3db80e27716fe090"} Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.105702 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" event={"ID":"8fb85cad-ec2d-4ada-bd68-55937d96a779","Type":"ContainerDied","Data":"32fbad917c53f080dae29a17b7d2e0db3f0b48efe2df248f03fa8431da965ad3"} Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.105706 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.116071 4837 generic.go:334] "Generic (PLEG): container finished" podID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" containerID="caf645720e683fd04b4144b714a66fde6f0b64f2a123d5270dabac05a2a4caaa" exitCode=0 Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.116143 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7crb6" event={"ID":"080747b0-3d43-4ff1-b21c-b8ea9fc2f961","Type":"ContainerDied","Data":"caf645720e683fd04b4144b714a66fde6f0b64f2a123d5270dabac05a2a4caaa"} Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.116181 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7crb6" event={"ID":"080747b0-3d43-4ff1-b21c-b8ea9fc2f961","Type":"ContainerDied","Data":"3672e1f233b40bf42b048214c1fa7e9647f6025a8a0466aed9482e60a925fb22"} Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.116185 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.118056 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "080747b0-3d43-4ff1-b21c-b8ea9fc2f961" (UID: "080747b0-3d43-4ff1-b21c-b8ea9fc2f961"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.120693 4837 generic.go:334] "Generic (PLEG): container finished" podID="e6060cf2-077e-4112-af57-f100e297f320" containerID="981d238a29da8dc69fd7413479e02e57c3595b2787cab7169c57d333172bede1" exitCode=0 Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.120753 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft6cr" event={"ID":"e6060cf2-077e-4112-af57-f100e297f320","Type":"ContainerDied","Data":"981d238a29da8dc69fd7413479e02e57c3595b2787cab7169c57d333172bede1"} Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.120758 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ft6cr" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.120774 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft6cr" event={"ID":"e6060cf2-077e-4112-af57-f100e297f320","Type":"ContainerDied","Data":"4b6c9ae51e3fb9c4dadef31697baf0c351e16ed9f865f9be7126242388f9b2dd"} Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.130376 4837 scope.go:117] "RemoveContainer" containerID="f0462ba3a7de7e503c15d1b89ef700574f3f475b90e578e1c36542ba0d37ed51" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.143401 4837 generic.go:334] "Generic (PLEG): container finished" podID="278c91cc-2624-42cd-a35e-287e22d22f7d" containerID="e6908d46230c52fea1c314d660f53fb74dbfd03beed19d0d7b5d526d78fc8a6c" exitCode=0 Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.143450 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twtbj" event={"ID":"278c91cc-2624-42cd-a35e-287e22d22f7d","Type":"ContainerDied","Data":"e6908d46230c52fea1c314d660f53fb74dbfd03beed19d0d7b5d526d78fc8a6c"} Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.143482 4837 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twtbj" event={"ID":"278c91cc-2624-42cd-a35e-287e22d22f7d","Type":"ContainerDied","Data":"c6c53bda7c5d3c5997c1cf5e6db327e83ff0de4776f4c37d442594e9111862d1"} Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.143555 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-twtbj" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.164245 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfddm\" (UniqueName: \"kubernetes.io/projected/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-kube-api-access-wfddm\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.164276 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.164288 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.164302 4837 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8fb85cad-ec2d-4ada-bd68-55937d96a779-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.173463 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8vgmn"] Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.178925 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556714-jzzgx"] Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.179581 4837 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" containerName="extract-content" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.179601 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" containerName="extract-content" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.179644 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" containerName="registry-server" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.179655 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" containerName="registry-server" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.179662 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" containerName="extract-content" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.179688 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" containerName="extract-content" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.179722 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" containerName="registry-server" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.179729 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" containerName="registry-server" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.179736 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" containerName="extract-utilities" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.179744 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" containerName="extract-utilities" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.179755 4837 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" containerName="extract-utilities" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.179760 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" containerName="extract-utilities" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.179770 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6060cf2-077e-4112-af57-f100e297f320" containerName="registry-server" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.179801 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6060cf2-077e-4112-af57-f100e297f320" containerName="registry-server" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.179808 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="278c91cc-2624-42cd-a35e-287e22d22f7d" containerName="extract-utilities" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.179814 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="278c91cc-2624-42cd-a35e-287e22d22f7d" containerName="extract-utilities" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.179820 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6060cf2-077e-4112-af57-f100e297f320" containerName="extract-utilities" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.179827 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6060cf2-077e-4112-af57-f100e297f320" containerName="extract-utilities" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.179846 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="278c91cc-2624-42cd-a35e-287e22d22f7d" containerName="registry-server" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.179854 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="278c91cc-2624-42cd-a35e-287e22d22f7d" containerName="registry-server" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.179894 4837 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e6060cf2-077e-4112-af57-f100e297f320" containerName="extract-content" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.179904 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6060cf2-077e-4112-af57-f100e297f320" containerName="extract-content" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.179912 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fb85cad-ec2d-4ada-bd68-55937d96a779" containerName="marketplace-operator" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.179920 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fb85cad-ec2d-4ada-bd68-55937d96a779" containerName="marketplace-operator" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.179932 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="278c91cc-2624-42cd-a35e-287e22d22f7d" containerName="extract-content" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.179940 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="278c91cc-2624-42cd-a35e-287e22d22f7d" containerName="extract-content" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.180108 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" containerName="registry-server" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.180118 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6060cf2-077e-4112-af57-f100e297f320" containerName="registry-server" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.180128 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fb85cad-ec2d-4ada-bd68-55937d96a779" containerName="marketplace-operator" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.180137 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="278c91cc-2624-42cd-a35e-287e22d22f7d" containerName="registry-server" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.180145 4837 
memory_manager.go:354] "RemoveStaleState removing state" podUID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" containerName="registry-server" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.180668 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556714-jzzgx" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.187314 4837 scope.go:117] "RemoveContainer" containerID="96174c590656df138c1d79af4e8416815fc220b78535e35ba951e58fb70ac305" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.187497 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8vgmn"] Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.195053 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.195184 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.195251 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.198068 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556714-jzzgx"] Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.208803 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/278c91cc-2624-42cd-a35e-287e22d22f7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "278c91cc-2624-42cd-a35e-287e22d22f7d" (UID: "278c91cc-2624-42cd-a35e-287e22d22f7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.219262 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ft6cr"] Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.229852 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ft6cr"] Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.264924 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jdql\" (UniqueName: \"kubernetes.io/projected/2b7a269a-3d94-4758-922d-9886312f2a25-kube-api-access-7jdql\") pod \"auto-csr-approver-29556714-jzzgx\" (UID: \"2b7a269a-3d94-4758-922d-9886312f2a25\") " pod="openshift-infra/auto-csr-approver-29556714-jzzgx" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.265014 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/278c91cc-2624-42cd-a35e-287e22d22f7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.277876 4837 scope.go:117] "RemoveContainer" containerID="1b6b0960e651037356989556f5ddff9457e82572c75941cbde7fc59810854ea0" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.282471 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b6b0960e651037356989556f5ddff9457e82572c75941cbde7fc59810854ea0\": container with ID starting with 1b6b0960e651037356989556f5ddff9457e82572c75941cbde7fc59810854ea0 not found: ID does not exist" containerID="1b6b0960e651037356989556f5ddff9457e82572c75941cbde7fc59810854ea0" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.282531 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b6b0960e651037356989556f5ddff9457e82572c75941cbde7fc59810854ea0"} err="failed to 
get container status \"1b6b0960e651037356989556f5ddff9457e82572c75941cbde7fc59810854ea0\": rpc error: code = NotFound desc = could not find container \"1b6b0960e651037356989556f5ddff9457e82572c75941cbde7fc59810854ea0\": container with ID starting with 1b6b0960e651037356989556f5ddff9457e82572c75941cbde7fc59810854ea0 not found: ID does not exist" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.282564 4837 scope.go:117] "RemoveContainer" containerID="f0462ba3a7de7e503c15d1b89ef700574f3f475b90e578e1c36542ba0d37ed51" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.282981 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0462ba3a7de7e503c15d1b89ef700574f3f475b90e578e1c36542ba0d37ed51\": container with ID starting with f0462ba3a7de7e503c15d1b89ef700574f3f475b90e578e1c36542ba0d37ed51 not found: ID does not exist" containerID="f0462ba3a7de7e503c15d1b89ef700574f3f475b90e578e1c36542ba0d37ed51" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.283023 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0462ba3a7de7e503c15d1b89ef700574f3f475b90e578e1c36542ba0d37ed51"} err="failed to get container status \"f0462ba3a7de7e503c15d1b89ef700574f3f475b90e578e1c36542ba0d37ed51\": rpc error: code = NotFound desc = could not find container \"f0462ba3a7de7e503c15d1b89ef700574f3f475b90e578e1c36542ba0d37ed51\": container with ID starting with f0462ba3a7de7e503c15d1b89ef700574f3f475b90e578e1c36542ba0d37ed51 not found: ID does not exist" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.283052 4837 scope.go:117] "RemoveContainer" containerID="96174c590656df138c1d79af4e8416815fc220b78535e35ba951e58fb70ac305" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.283555 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"96174c590656df138c1d79af4e8416815fc220b78535e35ba951e58fb70ac305\": container with ID starting with 96174c590656df138c1d79af4e8416815fc220b78535e35ba951e58fb70ac305 not found: ID does not exist" containerID="96174c590656df138c1d79af4e8416815fc220b78535e35ba951e58fb70ac305" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.283580 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96174c590656df138c1d79af4e8416815fc220b78535e35ba951e58fb70ac305"} err="failed to get container status \"96174c590656df138c1d79af4e8416815fc220b78535e35ba951e58fb70ac305\": rpc error: code = NotFound desc = could not find container \"96174c590656df138c1d79af4e8416815fc220b78535e35ba951e58fb70ac305\": container with ID starting with 96174c590656df138c1d79af4e8416815fc220b78535e35ba951e58fb70ac305 not found: ID does not exist" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.283595 4837 scope.go:117] "RemoveContainer" containerID="7062c61986b41d101ebecc3d1bfaa5e447d278c907a23b8b3db80e27716fe090" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.307418 4837 scope.go:117] "RemoveContainer" containerID="7062c61986b41d101ebecc3d1bfaa5e447d278c907a23b8b3db80e27716fe090" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.308194 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7062c61986b41d101ebecc3d1bfaa5e447d278c907a23b8b3db80e27716fe090\": container with ID starting with 7062c61986b41d101ebecc3d1bfaa5e447d278c907a23b8b3db80e27716fe090 not found: ID does not exist" containerID="7062c61986b41d101ebecc3d1bfaa5e447d278c907a23b8b3db80e27716fe090" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.308239 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7062c61986b41d101ebecc3d1bfaa5e447d278c907a23b8b3db80e27716fe090"} err="failed to get container status 
\"7062c61986b41d101ebecc3d1bfaa5e447d278c907a23b8b3db80e27716fe090\": rpc error: code = NotFound desc = could not find container \"7062c61986b41d101ebecc3d1bfaa5e447d278c907a23b8b3db80e27716fe090\": container with ID starting with 7062c61986b41d101ebecc3d1bfaa5e447d278c907a23b8b3db80e27716fe090 not found: ID does not exist"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.308268 4837 scope.go:117] "RemoveContainer" containerID="caf645720e683fd04b4144b714a66fde6f0b64f2a123d5270dabac05a2a4caaa"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.320665 4837 scope.go:117] "RemoveContainer" containerID="0d9d2068fa9a75fcedf62135c026dbc7b6be8fecbfe8ba1e1bd893e7874fe650"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.334897 4837 scope.go:117] "RemoveContainer" containerID="86b2c7193d237a632c12a22d24c63b2f2247ec49e6eadaf1bcddddc01304e298"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.351572 4837 scope.go:117] "RemoveContainer" containerID="caf645720e683fd04b4144b714a66fde6f0b64f2a123d5270dabac05a2a4caaa"
Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.352071 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caf645720e683fd04b4144b714a66fde6f0b64f2a123d5270dabac05a2a4caaa\": container with ID starting with caf645720e683fd04b4144b714a66fde6f0b64f2a123d5270dabac05a2a4caaa not found: ID does not exist" containerID="caf645720e683fd04b4144b714a66fde6f0b64f2a123d5270dabac05a2a4caaa"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.352109 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caf645720e683fd04b4144b714a66fde6f0b64f2a123d5270dabac05a2a4caaa"} err="failed to get container status \"caf645720e683fd04b4144b714a66fde6f0b64f2a123d5270dabac05a2a4caaa\": rpc error: code = NotFound desc = could not find container \"caf645720e683fd04b4144b714a66fde6f0b64f2a123d5270dabac05a2a4caaa\": container with ID starting with caf645720e683fd04b4144b714a66fde6f0b64f2a123d5270dabac05a2a4caaa not found: ID does not exist"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.352134 4837 scope.go:117] "RemoveContainer" containerID="0d9d2068fa9a75fcedf62135c026dbc7b6be8fecbfe8ba1e1bd893e7874fe650"
Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.353121 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d9d2068fa9a75fcedf62135c026dbc7b6be8fecbfe8ba1e1bd893e7874fe650\": container with ID starting with 0d9d2068fa9a75fcedf62135c026dbc7b6be8fecbfe8ba1e1bd893e7874fe650 not found: ID does not exist" containerID="0d9d2068fa9a75fcedf62135c026dbc7b6be8fecbfe8ba1e1bd893e7874fe650"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.353159 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d9d2068fa9a75fcedf62135c026dbc7b6be8fecbfe8ba1e1bd893e7874fe650"} err="failed to get container status \"0d9d2068fa9a75fcedf62135c026dbc7b6be8fecbfe8ba1e1bd893e7874fe650\": rpc error: code = NotFound desc = could not find container \"0d9d2068fa9a75fcedf62135c026dbc7b6be8fecbfe8ba1e1bd893e7874fe650\": container with ID starting with 0d9d2068fa9a75fcedf62135c026dbc7b6be8fecbfe8ba1e1bd893e7874fe650 not found: ID does not exist"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.353392 4837 scope.go:117] "RemoveContainer" containerID="86b2c7193d237a632c12a22d24c63b2f2247ec49e6eadaf1bcddddc01304e298"
Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.355099 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86b2c7193d237a632c12a22d24c63b2f2247ec49e6eadaf1bcddddc01304e298\": container with ID starting with 86b2c7193d237a632c12a22d24c63b2f2247ec49e6eadaf1bcddddc01304e298 not found: ID does not exist" containerID="86b2c7193d237a632c12a22d24c63b2f2247ec49e6eadaf1bcddddc01304e298"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.355147 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86b2c7193d237a632c12a22d24c63b2f2247ec49e6eadaf1bcddddc01304e298"} err="failed to get container status \"86b2c7193d237a632c12a22d24c63b2f2247ec49e6eadaf1bcddddc01304e298\": rpc error: code = NotFound desc = could not find container \"86b2c7193d237a632c12a22d24c63b2f2247ec49e6eadaf1bcddddc01304e298\": container with ID starting with 86b2c7193d237a632c12a22d24c63b2f2247ec49e6eadaf1bcddddc01304e298 not found: ID does not exist"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.355184 4837 scope.go:117] "RemoveContainer" containerID="981d238a29da8dc69fd7413479e02e57c3595b2787cab7169c57d333172bede1"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.366246 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jdql\" (UniqueName: \"kubernetes.io/projected/2b7a269a-3d94-4758-922d-9886312f2a25-kube-api-access-7jdql\") pod \"auto-csr-approver-29556714-jzzgx\" (UID: \"2b7a269a-3d94-4758-922d-9886312f2a25\") " pod="openshift-infra/auto-csr-approver-29556714-jzzgx"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.378709 4837 scope.go:117] "RemoveContainer" containerID="18157b9c0c2686c96644926d7ef6b55f8879075870dcef9cd6f8b2f09be008ae"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.384589 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jdql\" (UniqueName: \"kubernetes.io/projected/2b7a269a-3d94-4758-922d-9886312f2a25-kube-api-access-7jdql\") pod \"auto-csr-approver-29556714-jzzgx\" (UID: \"2b7a269a-3d94-4758-922d-9886312f2a25\") " pod="openshift-infra/auto-csr-approver-29556714-jzzgx"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.409018 4837 scope.go:117] "RemoveContainer" containerID="92a2c4e8c63e772dff31e06b47098a1634c8714cf4402cee4344a939f71bc1a7"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.422800 4837 scope.go:117] "RemoveContainer" containerID="981d238a29da8dc69fd7413479e02e57c3595b2787cab7169c57d333172bede1"
Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.424002 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"981d238a29da8dc69fd7413479e02e57c3595b2787cab7169c57d333172bede1\": container with ID starting with 981d238a29da8dc69fd7413479e02e57c3595b2787cab7169c57d333172bede1 not found: ID does not exist" containerID="981d238a29da8dc69fd7413479e02e57c3595b2787cab7169c57d333172bede1"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.424037 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"981d238a29da8dc69fd7413479e02e57c3595b2787cab7169c57d333172bede1"} err="failed to get container status \"981d238a29da8dc69fd7413479e02e57c3595b2787cab7169c57d333172bede1\": rpc error: code = NotFound desc = could not find container \"981d238a29da8dc69fd7413479e02e57c3595b2787cab7169c57d333172bede1\": container with ID starting with 981d238a29da8dc69fd7413479e02e57c3595b2787cab7169c57d333172bede1 not found: ID does not exist"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.424056 4837 scope.go:117] "RemoveContainer" containerID="18157b9c0c2686c96644926d7ef6b55f8879075870dcef9cd6f8b2f09be008ae"
Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.424337 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18157b9c0c2686c96644926d7ef6b55f8879075870dcef9cd6f8b2f09be008ae\": container with ID starting with 18157b9c0c2686c96644926d7ef6b55f8879075870dcef9cd6f8b2f09be008ae not found: ID does not exist" containerID="18157b9c0c2686c96644926d7ef6b55f8879075870dcef9cd6f8b2f09be008ae"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.424373 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18157b9c0c2686c96644926d7ef6b55f8879075870dcef9cd6f8b2f09be008ae"} err="failed to get container status \"18157b9c0c2686c96644926d7ef6b55f8879075870dcef9cd6f8b2f09be008ae\": rpc error: code = NotFound desc = could not find container \"18157b9c0c2686c96644926d7ef6b55f8879075870dcef9cd6f8b2f09be008ae\": container with ID starting with 18157b9c0c2686c96644926d7ef6b55f8879075870dcef9cd6f8b2f09be008ae not found: ID does not exist"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.424389 4837 scope.go:117] "RemoveContainer" containerID="92a2c4e8c63e772dff31e06b47098a1634c8714cf4402cee4344a939f71bc1a7"
Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.424710 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92a2c4e8c63e772dff31e06b47098a1634c8714cf4402cee4344a939f71bc1a7\": container with ID starting with 92a2c4e8c63e772dff31e06b47098a1634c8714cf4402cee4344a939f71bc1a7 not found: ID does not exist" containerID="92a2c4e8c63e772dff31e06b47098a1634c8714cf4402cee4344a939f71bc1a7"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.424737 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92a2c4e8c63e772dff31e06b47098a1634c8714cf4402cee4344a939f71bc1a7"} err="failed to get container status \"92a2c4e8c63e772dff31e06b47098a1634c8714cf4402cee4344a939f71bc1a7\": rpc error: code = NotFound desc = could not find container \"92a2c4e8c63e772dff31e06b47098a1634c8714cf4402cee4344a939f71bc1a7\": container with ID starting with 92a2c4e8c63e772dff31e06b47098a1634c8714cf4402cee4344a939f71bc1a7 not found: ID does not exist"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.424755 4837 scope.go:117] "RemoveContainer" containerID="e6908d46230c52fea1c314d660f53fb74dbfd03beed19d0d7b5d526d78fc8a6c"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.430164 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ng6kk"]
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.435868 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ng6kk"]
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.444062 4837 scope.go:117] "RemoveContainer" containerID="a7440d1435344c8bb57482e03b8adcb5ddb52bcac5f7b4d1ca42e60d0e6e91e3"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.451805 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7crb6"]
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.456765 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7crb6"]
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.461431 4837 scope.go:117] "RemoveContainer" containerID="2e92979b20e46c7135bec8322dc3792b24e0a0cbeac503bda1fa03e0621168f3"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.469179 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-twtbj"]
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.474844 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-twtbj"]
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.491815 4837 scope.go:117] "RemoveContainer" containerID="e6908d46230c52fea1c314d660f53fb74dbfd03beed19d0d7b5d526d78fc8a6c"
Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.492313 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6908d46230c52fea1c314d660f53fb74dbfd03beed19d0d7b5d526d78fc8a6c\": container with ID starting with e6908d46230c52fea1c314d660f53fb74dbfd03beed19d0d7b5d526d78fc8a6c not found: ID does not exist" containerID="e6908d46230c52fea1c314d660f53fb74dbfd03beed19d0d7b5d526d78fc8a6c"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.492353 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6908d46230c52fea1c314d660f53fb74dbfd03beed19d0d7b5d526d78fc8a6c"} err="failed to get container status \"e6908d46230c52fea1c314d660f53fb74dbfd03beed19d0d7b5d526d78fc8a6c\": rpc error: code = NotFound desc = could not find container \"e6908d46230c52fea1c314d660f53fb74dbfd03beed19d0d7b5d526d78fc8a6c\": container with ID starting with e6908d46230c52fea1c314d660f53fb74dbfd03beed19d0d7b5d526d78fc8a6c not found: ID does not exist"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.492379 4837 scope.go:117] "RemoveContainer" containerID="a7440d1435344c8bb57482e03b8adcb5ddb52bcac5f7b4d1ca42e60d0e6e91e3"
Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.492974 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7440d1435344c8bb57482e03b8adcb5ddb52bcac5f7b4d1ca42e60d0e6e91e3\": container with ID starting with a7440d1435344c8bb57482e03b8adcb5ddb52bcac5f7b4d1ca42e60d0e6e91e3 not found: ID does not exist" containerID="a7440d1435344c8bb57482e03b8adcb5ddb52bcac5f7b4d1ca42e60d0e6e91e3"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.493025 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7440d1435344c8bb57482e03b8adcb5ddb52bcac5f7b4d1ca42e60d0e6e91e3"} err="failed to get container status \"a7440d1435344c8bb57482e03b8adcb5ddb52bcac5f7b4d1ca42e60d0e6e91e3\": rpc error: code = NotFound desc = could not find container \"a7440d1435344c8bb57482e03b8adcb5ddb52bcac5f7b4d1ca42e60d0e6e91e3\": container with ID starting with a7440d1435344c8bb57482e03b8adcb5ddb52bcac5f7b4d1ca42e60d0e6e91e3 not found: ID does not exist"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.493054 4837 scope.go:117] "RemoveContainer" containerID="2e92979b20e46c7135bec8322dc3792b24e0a0cbeac503bda1fa03e0621168f3"
Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.493316 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e92979b20e46c7135bec8322dc3792b24e0a0cbeac503bda1fa03e0621168f3\": container with ID starting with 2e92979b20e46c7135bec8322dc3792b24e0a0cbeac503bda1fa03e0621168f3 not found: ID does not exist" containerID="2e92979b20e46c7135bec8322dc3792b24e0a0cbeac503bda1fa03e0621168f3"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.493347 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e92979b20e46c7135bec8322dc3792b24e0a0cbeac503bda1fa03e0621168f3"} err="failed to get container status \"2e92979b20e46c7135bec8322dc3792b24e0a0cbeac503bda1fa03e0621168f3\": rpc error: code = NotFound desc = could not find container \"2e92979b20e46c7135bec8322dc3792b24e0a0cbeac503bda1fa03e0621168f3\": container with ID starting with 2e92979b20e46c7135bec8322dc3792b24e0a0cbeac503bda1fa03e0621168f3 not found: ID does not exist"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.529057 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556714-jzzgx"
Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.947865 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556714-jzzgx"]
Mar 13 11:54:00 crc kubenswrapper[4837]: W0313 11:54:00.953351 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b7a269a_3d94_4758_922d_9886312f2a25.slice/crio-f1d46501ec1fa73334626e53641f933a075eca354c4c9f492c76bbcb4a084039 WatchSource:0}: Error finding container f1d46501ec1fa73334626e53641f933a075eca354c4c9f492c76bbcb4a084039: Status 404 returned error can't find the container with id f1d46501ec1fa73334626e53641f933a075eca354c4c9f492c76bbcb4a084039
Mar 13 11:54:01 crc kubenswrapper[4837]: I0313 11:54:01.058389 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" path="/var/lib/kubelet/pods/080747b0-3d43-4ff1-b21c-b8ea9fc2f961/volumes"
Mar 13 11:54:01 crc kubenswrapper[4837]: I0313 11:54:01.059406 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="278c91cc-2624-42cd-a35e-287e22d22f7d" path="/var/lib/kubelet/pods/278c91cc-2624-42cd-a35e-287e22d22f7d/volumes"
Mar 13 11:54:01 crc kubenswrapper[4837]: I0313 11:54:01.060174 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fb85cad-ec2d-4ada-bd68-55937d96a779" path="/var/lib/kubelet/pods/8fb85cad-ec2d-4ada-bd68-55937d96a779/volumes"
Mar 13 11:54:01 crc kubenswrapper[4837]: I0313 11:54:01.061092 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" path="/var/lib/kubelet/pods/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d/volumes"
Mar 13 11:54:01 crc kubenswrapper[4837]: I0313 11:54:01.061685 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6060cf2-077e-4112-af57-f100e297f320" path="/var/lib/kubelet/pods/e6060cf2-077e-4112-af57-f100e297f320/volumes"
Mar 13 11:54:01 crc kubenswrapper[4837]: I0313 11:54:01.149587 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556714-jzzgx" event={"ID":"2b7a269a-3d94-4758-922d-9886312f2a25","Type":"ContainerStarted","Data":"f1d46501ec1fa73334626e53641f933a075eca354c4c9f492c76bbcb4a084039"}
Mar 13 11:54:01 crc kubenswrapper[4837]: I0313 11:54:01.150934 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" event={"ID":"b87c8f86-a346-4907-9441-048c3220646f","Type":"ContainerStarted","Data":"469523a6f85ae1746666577053ce1be84d9142d1935d58c9331cba700d12263d"}
Mar 13 11:54:01 crc kubenswrapper[4837]: I0313 11:54:01.151260 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc"
Mar 13 11:54:01 crc kubenswrapper[4837]: I0313 11:54:01.154669 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc"
Mar 13 11:54:01 crc kubenswrapper[4837]: I0313 11:54:01.170104 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" podStartSLOduration=2.170085533 podStartE2EDuration="2.170085533s" podCreationTimestamp="2026-03-13 11:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:54:01.165291687 +0000 UTC m=+356.803558450" watchObservedRunningTime="2026-03-13 11:54:01.170085533 +0000 UTC m=+356.808352286"
Mar 13 11:54:02 crc kubenswrapper[4837]: I0313 11:54:02.164773 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556714-jzzgx" event={"ID":"2b7a269a-3d94-4758-922d-9886312f2a25","Type":"ContainerStarted","Data":"35377d4210b529c8401b806fa107dba5beb6002cbc3a3ce3ea9ad22bd10d0960"}
Mar 13 11:54:02 crc kubenswrapper[4837]: I0313 11:54:02.176369 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556714-jzzgx" podStartSLOduration=1.372039008 podStartE2EDuration="2.176348919s" podCreationTimestamp="2026-03-13 11:54:00 +0000 UTC" firstStartedPulling="2026-03-13 11:54:00.958314908 +0000 UTC m=+356.596581681" lastFinishedPulling="2026-03-13 11:54:01.762624829 +0000 UTC m=+357.400891592" observedRunningTime="2026-03-13 11:54:02.175548483 +0000 UTC m=+357.813815246" watchObservedRunningTime="2026-03-13 11:54:02.176348919 +0000 UTC m=+357.814615682"
Mar 13 11:54:03 crc kubenswrapper[4837]: I0313 11:54:03.172600 4837 generic.go:334] "Generic (PLEG): container finished" podID="2b7a269a-3d94-4758-922d-9886312f2a25" containerID="35377d4210b529c8401b806fa107dba5beb6002cbc3a3ce3ea9ad22bd10d0960" exitCode=0
Mar 13 11:54:03 crc kubenswrapper[4837]: I0313 11:54:03.172694 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556714-jzzgx" event={"ID":"2b7a269a-3d94-4758-922d-9886312f2a25","Type":"ContainerDied","Data":"35377d4210b529c8401b806fa107dba5beb6002cbc3a3ce3ea9ad22bd10d0960"}
Mar 13 11:54:04 crc kubenswrapper[4837]: I0313 11:54:04.522901 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556714-jzzgx"
Mar 13 11:54:04 crc kubenswrapper[4837]: I0313 11:54:04.648386 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jdql\" (UniqueName: \"kubernetes.io/projected/2b7a269a-3d94-4758-922d-9886312f2a25-kube-api-access-7jdql\") pod \"2b7a269a-3d94-4758-922d-9886312f2a25\" (UID: \"2b7a269a-3d94-4758-922d-9886312f2a25\") "
Mar 13 11:54:04 crc kubenswrapper[4837]: I0313 11:54:04.653270 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b7a269a-3d94-4758-922d-9886312f2a25-kube-api-access-7jdql" (OuterVolumeSpecName: "kube-api-access-7jdql") pod "2b7a269a-3d94-4758-922d-9886312f2a25" (UID: "2b7a269a-3d94-4758-922d-9886312f2a25"). InnerVolumeSpecName "kube-api-access-7jdql". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 11:54:04 crc kubenswrapper[4837]: I0313 11:54:04.749268 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jdql\" (UniqueName: \"kubernetes.io/projected/2b7a269a-3d94-4758-922d-9886312f2a25-kube-api-access-7jdql\") on node \"crc\" DevicePath \"\""
Mar 13 11:54:05 crc kubenswrapper[4837]: I0313 11:54:05.188334 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556714-jzzgx" event={"ID":"2b7a269a-3d94-4758-922d-9886312f2a25","Type":"ContainerDied","Data":"f1d46501ec1fa73334626e53641f933a075eca354c4c9f492c76bbcb4a084039"}
Mar 13 11:54:05 crc kubenswrapper[4837]: I0313 11:54:05.188399 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1d46501ec1fa73334626e53641f933a075eca354c4c9f492c76bbcb4a084039"
Mar 13 11:54:05 crc kubenswrapper[4837]: I0313 11:54:05.188441 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556714-jzzgx"
Mar 13 11:54:17 crc kubenswrapper[4837]: I0313 11:54:17.996997 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8q6j6"]
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.022002 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" podUID="27d45de2-e0ab-4c3e-b3da-b20e60e26801" containerName="oauth-openshift" containerID="cri-o://7788f0babcbd0ba3005289dc42abd3560a56f1f0efe57b0376342454820793c4" gracePeriod=15
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.407790 4837 generic.go:334] "Generic (PLEG): container finished" podID="27d45de2-e0ab-4c3e-b3da-b20e60e26801" containerID="7788f0babcbd0ba3005289dc42abd3560a56f1f0efe57b0376342454820793c4" exitCode=0
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.407876 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" event={"ID":"27d45de2-e0ab-4c3e-b3da-b20e60e26801","Type":"ContainerDied","Data":"7788f0babcbd0ba3005289dc42abd3560a56f1f0efe57b0376342454820793c4"}
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.408196 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" event={"ID":"27d45de2-e0ab-4c3e-b3da-b20e60e26801","Type":"ContainerDied","Data":"48f88856d0aa99c22451af4774004c789a7baf644ed71ee96a301b56c7368078"}
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.408209 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48f88856d0aa99c22451af4774004c789a7baf644ed71ee96a301b56c7368078"
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.423515 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6"
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.458977 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-576c48cff9-gb95w"]
Mar 13 11:54:43 crc kubenswrapper[4837]: E0313 11:54:43.459230 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b7a269a-3d94-4758-922d-9886312f2a25" containerName="oc"
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.459251 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7a269a-3d94-4758-922d-9886312f2a25" containerName="oc"
Mar 13 11:54:43 crc kubenswrapper[4837]: E0313 11:54:43.459469 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27d45de2-e0ab-4c3e-b3da-b20e60e26801" containerName="oauth-openshift"
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.459673 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="27d45de2-e0ab-4c3e-b3da-b20e60e26801" containerName="oauth-openshift"
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.459863 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="27d45de2-e0ab-4c3e-b3da-b20e60e26801" containerName="oauth-openshift"
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.459892 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b7a269a-3d94-4758-922d-9886312f2a25" containerName="oc"
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.460359 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w"
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.465009 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-576c48cff9-gb95w"]
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505351 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-cliconfig\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") "
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505414 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-idp-0-file-data\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") "
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505472 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-service-ca\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") "
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505499 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-error\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") "
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505529 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-audit-policies\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") "
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505559 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-ocp-branding-template\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") "
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505593 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-login\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") "
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505674 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-serving-cert\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") "
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505702 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-provider-selection\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") "
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505736 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zn764\" (UniqueName: \"kubernetes.io/projected/27d45de2-e0ab-4c3e-b3da-b20e60e26801-kube-api-access-zn764\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") "
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505771 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-session\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") "
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505794 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27d45de2-e0ab-4c3e-b3da-b20e60e26801-audit-dir\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") "
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505822 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-router-certs\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") "
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505860 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-trusted-ca-bundle\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") "
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506054 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-router-certs\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w"
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506107 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-user-template-error\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w"
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506137 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10374b6c-b203-46d7-856b-ca95bb2f19a7-audit-dir\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w"
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506175 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-session\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w"
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506206 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-service-ca\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w"
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506232 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-user-template-login\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w"
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506248 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506259 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/10374b6c-b203-46d7-856b-ca95bb2f19a7-audit-policies\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w"
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506318 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w"
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506323 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506343 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w"
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506421 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w"
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506462 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w"
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506467 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506490 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9grzq\" (UniqueName: \"kubernetes.io/projected/10374b6c-b203-46d7-856b-ca95bb2f19a7-kube-api-access-9grzq\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w"
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506625 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w"
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506667 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506685 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w"
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506851 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506864 4837 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506883 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506900 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506697 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/27d45de2-e0ab-4c3e-b3da-b20e60e26801-audit-dir" (OuterVolumeSpecName: "audit-dir") pod
"27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.512848 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.514211 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.514399 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.515042 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.515835 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.515993 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.516673 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.517687 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.525622 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27d45de2-e0ab-4c3e-b3da-b20e60e26801-kube-api-access-zn764" (OuterVolumeSpecName: "kube-api-access-zn764") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "kube-api-access-zn764". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.608014 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-user-template-error\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.608278 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10374b6c-b203-46d7-856b-ca95bb2f19a7-audit-dir\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.608371 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-session\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.608464 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-service-ca\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.608566 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-user-template-login\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.608803 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/10374b6c-b203-46d7-856b-ca95bb2f19a7-audit-policies\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.608362 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10374b6c-b203-46d7-856b-ca95bb2f19a7-audit-dir\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc 
kubenswrapper[4837]: I0313 11:54:43.608891 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609165 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609219 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609262 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609288 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9grzq\" (UniqueName: 
\"kubernetes.io/projected/10374b6c-b203-46d7-856b-ca95bb2f19a7-kube-api-access-9grzq\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609329 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609352 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609407 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-router-certs\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609473 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609491 4837 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609502 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609514 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609526 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zn764\" (UniqueName: \"kubernetes.io/projected/27d45de2-e0ab-4c3e-b3da-b20e60e26801-kube-api-access-zn764\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609536 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609548 4837 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27d45de2-e0ab-4c3e-b3da-b20e60e26801-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609557 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:43 
crc kubenswrapper[4837]: I0313 11:54:43.609568 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609577 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609220 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-service-ca\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.610024 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.610068 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.610165 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/10374b6c-b203-46d7-856b-ca95bb2f19a7-audit-policies\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.611325 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-user-template-error\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.612898 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-router-certs\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.613042 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-user-template-login\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.613214 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " 
pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.613251 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.613738 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-session\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.614579 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.626950 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.629354 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9grzq\" (UniqueName: \"kubernetes.io/projected/10374b6c-b203-46d7-856b-ca95bb2f19a7-kube-api-access-9grzq\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.776559 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.211062 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-576c48cff9-gb95w"] Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.366612 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tjvc6"] Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.368337 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tjvc6" Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.371140 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.382808 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tjvc6"] Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.414481 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" event={"ID":"10374b6c-b203-46d7-856b-ca95bb2f19a7","Type":"ContainerStarted","Data":"b9ac8d05b9f1c86c2c199c32721944cbd6e9fdc505e7326b2ec5798ccc5c9882"} Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.414509 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.417548 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t5r4\" (UniqueName: \"kubernetes.io/projected/07889497-1048-4f7a-9245-132767bb28b6-kube-api-access-9t5r4\") pod \"community-operators-tjvc6\" (UID: \"07889497-1048-4f7a-9245-132767bb28b6\") " pod="openshift-marketplace/community-operators-tjvc6" Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.417829 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07889497-1048-4f7a-9245-132767bb28b6-catalog-content\") pod \"community-operators-tjvc6\" (UID: \"07889497-1048-4f7a-9245-132767bb28b6\") " pod="openshift-marketplace/community-operators-tjvc6" Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.417953 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07889497-1048-4f7a-9245-132767bb28b6-utilities\") pod \"community-operators-tjvc6\" (UID: \"07889497-1048-4f7a-9245-132767bb28b6\") " pod="openshift-marketplace/community-operators-tjvc6" Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.446205 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8q6j6"] Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.449112 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8q6j6"] Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.519352 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t5r4\" (UniqueName: \"kubernetes.io/projected/07889497-1048-4f7a-9245-132767bb28b6-kube-api-access-9t5r4\") pod 
\"community-operators-tjvc6\" (UID: \"07889497-1048-4f7a-9245-132767bb28b6\") " pod="openshift-marketplace/community-operators-tjvc6" Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.519807 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07889497-1048-4f7a-9245-132767bb28b6-catalog-content\") pod \"community-operators-tjvc6\" (UID: \"07889497-1048-4f7a-9245-132767bb28b6\") " pod="openshift-marketplace/community-operators-tjvc6" Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.519906 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07889497-1048-4f7a-9245-132767bb28b6-utilities\") pod \"community-operators-tjvc6\" (UID: \"07889497-1048-4f7a-9245-132767bb28b6\") " pod="openshift-marketplace/community-operators-tjvc6" Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.520624 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07889497-1048-4f7a-9245-132767bb28b6-catalog-content\") pod \"community-operators-tjvc6\" (UID: \"07889497-1048-4f7a-9245-132767bb28b6\") " pod="openshift-marketplace/community-operators-tjvc6" Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.520803 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07889497-1048-4f7a-9245-132767bb28b6-utilities\") pod \"community-operators-tjvc6\" (UID: \"07889497-1048-4f7a-9245-132767bb28b6\") " pod="openshift-marketplace/community-operators-tjvc6" Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.538574 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t5r4\" (UniqueName: \"kubernetes.io/projected/07889497-1048-4f7a-9245-132767bb28b6-kube-api-access-9t5r4\") pod \"community-operators-tjvc6\" (UID: 
\"07889497-1048-4f7a-9245-132767bb28b6\") " pod="openshift-marketplace/community-operators-tjvc6"
Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.725492 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tjvc6"
Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.055050 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27d45de2-e0ab-4c3e-b3da-b20e60e26801" path="/var/lib/kubelet/pods/27d45de2-e0ab-4c3e-b3da-b20e60e26801/volumes"
Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.169983 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tjvc6"]
Mar 13 11:54:45 crc kubenswrapper[4837]: W0313 11:54:45.178148 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07889497_1048_4f7a_9245_132767bb28b6.slice/crio-88809656a4007e3a80b8ce9dec903bad48ed23e1bf04235cdee5afb969a57ed8 WatchSource:0}: Error finding container 88809656a4007e3a80b8ce9dec903bad48ed23e1bf04235cdee5afb969a57ed8: Status 404 returned error can't find the container with id 88809656a4007e3a80b8ce9dec903bad48ed23e1bf04235cdee5afb969a57ed8
Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.422864 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" event={"ID":"10374b6c-b203-46d7-856b-ca95bb2f19a7","Type":"ContainerStarted","Data":"8f25fb8bd0e85a8245bb09f66b22b04c88fd4b7f814a4f8e7416ee0bd04e0d77"}
Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.424785 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w"
Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.426401 4837 generic.go:334] "Generic (PLEG): container finished" podID="07889497-1048-4f7a-9245-132767bb28b6" containerID="d65091f8ad9f264991fc060bd7f7b7cc92043cbc5ac816b835141daaa1a15860" exitCode=0
Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.426464 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjvc6" event={"ID":"07889497-1048-4f7a-9245-132767bb28b6","Type":"ContainerDied","Data":"d65091f8ad9f264991fc060bd7f7b7cc92043cbc5ac816b835141daaa1a15860"}
Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.426534 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjvc6" event={"ID":"07889497-1048-4f7a-9245-132767bb28b6","Type":"ContainerStarted","Data":"88809656a4007e3a80b8ce9dec903bad48ed23e1bf04235cdee5afb969a57ed8"}
Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.435512 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w"
Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.455011 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" podStartSLOduration=27.454991948 podStartE2EDuration="27.454991948s" podCreationTimestamp="2026-03-13 11:54:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:54:45.452325453 +0000 UTC m=+401.090592246" watchObservedRunningTime="2026-03-13 11:54:45.454991948 +0000 UTC m=+401.093258711"
Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.759162 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kpp2z"]
Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.760189 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kpp2z"
Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.761958 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.774278 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kpp2z"]
Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.839455 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d96905d-521e-4ab9-87a8-d6edd0c027ed-utilities\") pod \"redhat-operators-kpp2z\" (UID: \"8d96905d-521e-4ab9-87a8-d6edd0c027ed\") " pod="openshift-marketplace/redhat-operators-kpp2z"
Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.839689 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d96905d-521e-4ab9-87a8-d6edd0c027ed-catalog-content\") pod \"redhat-operators-kpp2z\" (UID: \"8d96905d-521e-4ab9-87a8-d6edd0c027ed\") " pod="openshift-marketplace/redhat-operators-kpp2z"
Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.839774 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgw6h\" (UniqueName: \"kubernetes.io/projected/8d96905d-521e-4ab9-87a8-d6edd0c027ed-kube-api-access-hgw6h\") pod \"redhat-operators-kpp2z\" (UID: \"8d96905d-521e-4ab9-87a8-d6edd0c027ed\") " pod="openshift-marketplace/redhat-operators-kpp2z"
Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.941894 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d96905d-521e-4ab9-87a8-d6edd0c027ed-catalog-content\") pod \"redhat-operators-kpp2z\" (UID: \"8d96905d-521e-4ab9-87a8-d6edd0c027ed\") " pod="openshift-marketplace/redhat-operators-kpp2z"
Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.942087 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgw6h\" (UniqueName: \"kubernetes.io/projected/8d96905d-521e-4ab9-87a8-d6edd0c027ed-kube-api-access-hgw6h\") pod \"redhat-operators-kpp2z\" (UID: \"8d96905d-521e-4ab9-87a8-d6edd0c027ed\") " pod="openshift-marketplace/redhat-operators-kpp2z"
Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.942288 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d96905d-521e-4ab9-87a8-d6edd0c027ed-utilities\") pod \"redhat-operators-kpp2z\" (UID: \"8d96905d-521e-4ab9-87a8-d6edd0c027ed\") " pod="openshift-marketplace/redhat-operators-kpp2z"
Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.942474 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d96905d-521e-4ab9-87a8-d6edd0c027ed-catalog-content\") pod \"redhat-operators-kpp2z\" (UID: \"8d96905d-521e-4ab9-87a8-d6edd0c027ed\") " pod="openshift-marketplace/redhat-operators-kpp2z"
Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.942776 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d96905d-521e-4ab9-87a8-d6edd0c027ed-utilities\") pod \"redhat-operators-kpp2z\" (UID: \"8d96905d-521e-4ab9-87a8-d6edd0c027ed\") " pod="openshift-marketplace/redhat-operators-kpp2z"
Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.962464 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgw6h\" (UniqueName: \"kubernetes.io/projected/8d96905d-521e-4ab9-87a8-d6edd0c027ed-kube-api-access-hgw6h\") pod \"redhat-operators-kpp2z\" (UID: \"8d96905d-521e-4ab9-87a8-d6edd0c027ed\") " pod="openshift-marketplace/redhat-operators-kpp2z"
Mar 13 11:54:46 crc kubenswrapper[4837]: I0313 11:54:46.082127 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kpp2z"
Mar 13 11:54:46 crc kubenswrapper[4837]: I0313 11:54:46.434622 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjvc6" event={"ID":"07889497-1048-4f7a-9245-132767bb28b6","Type":"ContainerStarted","Data":"f302494fedb15539b050a28041be2f2f75279a64dd9207158b12d8d4c083cf2d"}
Mar 13 11:54:46 crc kubenswrapper[4837]: I0313 11:54:46.489763 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kpp2z"]
Mar 13 11:54:46 crc kubenswrapper[4837]: W0313 11:54:46.533001 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d96905d_521e_4ab9_87a8_d6edd0c027ed.slice/crio-4b96d844ae19ddd09e809c479227559e4835154b1a463a235e51a5a8275afc65 WatchSource:0}: Error finding container 4b96d844ae19ddd09e809c479227559e4835154b1a463a235e51a5a8275afc65: Status 404 returned error can't find the container with id 4b96d844ae19ddd09e809c479227559e4835154b1a463a235e51a5a8275afc65
Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.158916 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zckjb"]
Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.160086 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zckjb"
Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.162018 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.169034 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zckjb"]
Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.267168 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj6zs\" (UniqueName: \"kubernetes.io/projected/4298f221-fd11-49a1-a0e9-6f95dbdedc44-kube-api-access-xj6zs\") pod \"certified-operators-zckjb\" (UID: \"4298f221-fd11-49a1-a0e9-6f95dbdedc44\") " pod="openshift-marketplace/certified-operators-zckjb"
Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.267256 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4298f221-fd11-49a1-a0e9-6f95dbdedc44-utilities\") pod \"certified-operators-zckjb\" (UID: \"4298f221-fd11-49a1-a0e9-6f95dbdedc44\") " pod="openshift-marketplace/certified-operators-zckjb"
Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.267564 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4298f221-fd11-49a1-a0e9-6f95dbdedc44-catalog-content\") pod \"certified-operators-zckjb\" (UID: \"4298f221-fd11-49a1-a0e9-6f95dbdedc44\") " pod="openshift-marketplace/certified-operators-zckjb"
Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.368737 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4298f221-fd11-49a1-a0e9-6f95dbdedc44-catalog-content\") pod \"certified-operators-zckjb\" (UID: \"4298f221-fd11-49a1-a0e9-6f95dbdedc44\") " pod="openshift-marketplace/certified-operators-zckjb"
Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.368821 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj6zs\" (UniqueName: \"kubernetes.io/projected/4298f221-fd11-49a1-a0e9-6f95dbdedc44-kube-api-access-xj6zs\") pod \"certified-operators-zckjb\" (UID: \"4298f221-fd11-49a1-a0e9-6f95dbdedc44\") " pod="openshift-marketplace/certified-operators-zckjb"
Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.368877 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4298f221-fd11-49a1-a0e9-6f95dbdedc44-utilities\") pod \"certified-operators-zckjb\" (UID: \"4298f221-fd11-49a1-a0e9-6f95dbdedc44\") " pod="openshift-marketplace/certified-operators-zckjb"
Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.369443 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4298f221-fd11-49a1-a0e9-6f95dbdedc44-catalog-content\") pod \"certified-operators-zckjb\" (UID: \"4298f221-fd11-49a1-a0e9-6f95dbdedc44\") " pod="openshift-marketplace/certified-operators-zckjb"
Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.369591 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4298f221-fd11-49a1-a0e9-6f95dbdedc44-utilities\") pod \"certified-operators-zckjb\" (UID: \"4298f221-fd11-49a1-a0e9-6f95dbdedc44\") " pod="openshift-marketplace/certified-operators-zckjb"
Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.393563 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj6zs\" (UniqueName: \"kubernetes.io/projected/4298f221-fd11-49a1-a0e9-6f95dbdedc44-kube-api-access-xj6zs\") pod \"certified-operators-zckjb\" (UID: \"4298f221-fd11-49a1-a0e9-6f95dbdedc44\") " pod="openshift-marketplace/certified-operators-zckjb"
Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.440562 4837 generic.go:334] "Generic (PLEG): container finished" podID="8d96905d-521e-4ab9-87a8-d6edd0c027ed" containerID="21d402f48e20199c93f215f89accb38b25a12a6308104cc7886b3e3df832e271" exitCode=0
Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.440696 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpp2z" event={"ID":"8d96905d-521e-4ab9-87a8-d6edd0c027ed","Type":"ContainerDied","Data":"21d402f48e20199c93f215f89accb38b25a12a6308104cc7886b3e3df832e271"}
Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.440805 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpp2z" event={"ID":"8d96905d-521e-4ab9-87a8-d6edd0c027ed","Type":"ContainerStarted","Data":"4b96d844ae19ddd09e809c479227559e4835154b1a463a235e51a5a8275afc65"}
Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.442540 4837 generic.go:334] "Generic (PLEG): container finished" podID="07889497-1048-4f7a-9245-132767bb28b6" containerID="f302494fedb15539b050a28041be2f2f75279a64dd9207158b12d8d4c083cf2d" exitCode=0
Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.442627 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjvc6" event={"ID":"07889497-1048-4f7a-9245-132767bb28b6","Type":"ContainerDied","Data":"f302494fedb15539b050a28041be2f2f75279a64dd9207158b12d8d4c083cf2d"}
Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.477884 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zckjb"
Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.870630 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zckjb"]
Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.156927 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m5v5n"]
Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.157950 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5v5n"
Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.160282 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.171048 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5v5n"]
Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.183106 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fec78503-41e5-45f4-9217-1debe55ec107-utilities\") pod \"redhat-marketplace-m5v5n\" (UID: \"fec78503-41e5-45f4-9217-1debe55ec107\") " pod="openshift-marketplace/redhat-marketplace-m5v5n"
Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.183158 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fec78503-41e5-45f4-9217-1debe55ec107-catalog-content\") pod \"redhat-marketplace-m5v5n\" (UID: \"fec78503-41e5-45f4-9217-1debe55ec107\") " pod="openshift-marketplace/redhat-marketplace-m5v5n"
Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.183227 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2xqh\" (UniqueName: \"kubernetes.io/projected/fec78503-41e5-45f4-9217-1debe55ec107-kube-api-access-w2xqh\") pod \"redhat-marketplace-m5v5n\" (UID: \"fec78503-41e5-45f4-9217-1debe55ec107\") " pod="openshift-marketplace/redhat-marketplace-m5v5n"
Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.283881 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fec78503-41e5-45f4-9217-1debe55ec107-catalog-content\") pod \"redhat-marketplace-m5v5n\" (UID: \"fec78503-41e5-45f4-9217-1debe55ec107\") " pod="openshift-marketplace/redhat-marketplace-m5v5n"
Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.283934 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2xqh\" (UniqueName: \"kubernetes.io/projected/fec78503-41e5-45f4-9217-1debe55ec107-kube-api-access-w2xqh\") pod \"redhat-marketplace-m5v5n\" (UID: \"fec78503-41e5-45f4-9217-1debe55ec107\") " pod="openshift-marketplace/redhat-marketplace-m5v5n"
Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.284015 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fec78503-41e5-45f4-9217-1debe55ec107-utilities\") pod \"redhat-marketplace-m5v5n\" (UID: \"fec78503-41e5-45f4-9217-1debe55ec107\") " pod="openshift-marketplace/redhat-marketplace-m5v5n"
Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.284776 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fec78503-41e5-45f4-9217-1debe55ec107-catalog-content\") pod \"redhat-marketplace-m5v5n\" (UID: \"fec78503-41e5-45f4-9217-1debe55ec107\") " pod="openshift-marketplace/redhat-marketplace-m5v5n"
Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.284797 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fec78503-41e5-45f4-9217-1debe55ec107-utilities\") pod \"redhat-marketplace-m5v5n\" (UID: \"fec78503-41e5-45f4-9217-1debe55ec107\") " pod="openshift-marketplace/redhat-marketplace-m5v5n"
Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.304005 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2xqh\" (UniqueName: \"kubernetes.io/projected/fec78503-41e5-45f4-9217-1debe55ec107-kube-api-access-w2xqh\") pod \"redhat-marketplace-m5v5n\" (UID: \"fec78503-41e5-45f4-9217-1debe55ec107\") " pod="openshift-marketplace/redhat-marketplace-m5v5n"
Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.452114 4837 generic.go:334] "Generic (PLEG): container finished" podID="4298f221-fd11-49a1-a0e9-6f95dbdedc44" containerID="18e61808bcae466833085d24179cdda44de2a637db7884a1b4abbd72b382f4a5" exitCode=0
Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.452185 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zckjb" event={"ID":"4298f221-fd11-49a1-a0e9-6f95dbdedc44","Type":"ContainerDied","Data":"18e61808bcae466833085d24179cdda44de2a637db7884a1b4abbd72b382f4a5"}
Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.452587 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zckjb" event={"ID":"4298f221-fd11-49a1-a0e9-6f95dbdedc44","Type":"ContainerStarted","Data":"786b091b8fbd379c50d6730e052c15a6bae184fc90f99430dc54b0aa7193c96c"}
Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.456312 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpp2z" event={"ID":"8d96905d-521e-4ab9-87a8-d6edd0c027ed","Type":"ContainerStarted","Data":"c2579f686b8be02a64326106fec11a117b09f60501908a0ca1df4ea63ca94522"}
Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.461940 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjvc6" event={"ID":"07889497-1048-4f7a-9245-132767bb28b6","Type":"ContainerStarted","Data":"6dc4460a0d460c82f61640b3c5c6c53eac6f4b5becc2eae019cec7a28347f2bd"}
Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.480121 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5v5n"
Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.519256 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tjvc6" podStartSLOduration=1.8924836040000002 podStartE2EDuration="4.519237885s" podCreationTimestamp="2026-03-13 11:54:44 +0000 UTC" firstStartedPulling="2026-03-13 11:54:45.42835796 +0000 UTC m=+401.066624723" lastFinishedPulling="2026-03-13 11:54:48.055112241 +0000 UTC m=+403.693379004" observedRunningTime="2026-03-13 11:54:48.519029789 +0000 UTC m=+404.157296552" watchObservedRunningTime="2026-03-13 11:54:48.519237885 +0000 UTC m=+404.157504648"
Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.953043 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5v5n"]
Mar 13 11:54:49 crc kubenswrapper[4837]: I0313 11:54:49.469078 4837 generic.go:334] "Generic (PLEG): container finished" podID="8d96905d-521e-4ab9-87a8-d6edd0c027ed" containerID="c2579f686b8be02a64326106fec11a117b09f60501908a0ca1df4ea63ca94522" exitCode=0
Mar 13 11:54:49 crc kubenswrapper[4837]: I0313 11:54:49.469195 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpp2z" event={"ID":"8d96905d-521e-4ab9-87a8-d6edd0c027ed","Type":"ContainerDied","Data":"c2579f686b8be02a64326106fec11a117b09f60501908a0ca1df4ea63ca94522"}
Mar 13 11:54:49 crc kubenswrapper[4837]: I0313 11:54:49.472946 4837 generic.go:334] "Generic (PLEG): container finished" podID="fec78503-41e5-45f4-9217-1debe55ec107" containerID="e1853ac25f407645157ae4b773160c6c162ac692f7690b340b59353f7a9f34a9" exitCode=0
Mar 13 11:54:49 crc kubenswrapper[4837]: I0313 11:54:49.472990 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5v5n" event={"ID":"fec78503-41e5-45f4-9217-1debe55ec107","Type":"ContainerDied","Data":"e1853ac25f407645157ae4b773160c6c162ac692f7690b340b59353f7a9f34a9"}
Mar 13 11:54:49 crc kubenswrapper[4837]: I0313 11:54:49.473030 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5v5n" event={"ID":"fec78503-41e5-45f4-9217-1debe55ec107","Type":"ContainerStarted","Data":"5e9f2d804e0645269d40b445a2e8eaa9cf543ef4b51c2e12b9e3e4addefc241c"}
Mar 13 11:54:49 crc kubenswrapper[4837]: I0313 11:54:49.476374 4837 generic.go:334] "Generic (PLEG): container finished" podID="4298f221-fd11-49a1-a0e9-6f95dbdedc44" containerID="13331e95c57b7e12e7a85ec5552e0f4233276b481c6c04bd1f77be9de05f66ec" exitCode=0
Mar 13 11:54:49 crc kubenswrapper[4837]: I0313 11:54:49.476479 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zckjb" event={"ID":"4298f221-fd11-49a1-a0e9-6f95dbdedc44","Type":"ContainerDied","Data":"13331e95c57b7e12e7a85ec5552e0f4233276b481c6c04bd1f77be9de05f66ec"}
Mar 13 11:54:50 crc kubenswrapper[4837]: I0313 11:54:50.483833 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpp2z" event={"ID":"8d96905d-521e-4ab9-87a8-d6edd0c027ed","Type":"ContainerStarted","Data":"4be2ec40a620706421a0a4c0f49c8c79e68837d9d8e70c2a0546a05222c9171c"}
Mar 13 11:54:50 crc kubenswrapper[4837]: I0313 11:54:50.486189 4837 generic.go:334] "Generic (PLEG): container finished" podID="fec78503-41e5-45f4-9217-1debe55ec107" containerID="be6a714256c0eb57e3916efdf5b2ce4ae349e4629a72382f7d597ae3648b7920" exitCode=0
Mar 13 11:54:50 crc kubenswrapper[4837]: I0313 11:54:50.486246 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5v5n" event={"ID":"fec78503-41e5-45f4-9217-1debe55ec107","Type":"ContainerDied","Data":"be6a714256c0eb57e3916efdf5b2ce4ae349e4629a72382f7d597ae3648b7920"}
Mar 13 11:54:50 crc kubenswrapper[4837]: I0313 11:54:50.489973 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zckjb" event={"ID":"4298f221-fd11-49a1-a0e9-6f95dbdedc44","Type":"ContainerStarted","Data":"860677672a2e1ac86a6b3b31b163a6c722e4b411174f630862f72771875670b8"}
Mar 13 11:54:50 crc kubenswrapper[4837]: I0313 11:54:50.513868 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kpp2z" podStartSLOduration=3.051657229 podStartE2EDuration="5.51385263s" podCreationTimestamp="2026-03-13 11:54:45 +0000 UTC" firstStartedPulling="2026-03-13 11:54:47.442738674 +0000 UTC m=+403.081005437" lastFinishedPulling="2026-03-13 11:54:49.904934075 +0000 UTC m=+405.543200838" observedRunningTime="2026-03-13 11:54:50.511665991 +0000 UTC m=+406.149932754" watchObservedRunningTime="2026-03-13 11:54:50.51385263 +0000 UTC m=+406.152119393"
Mar 13 11:54:50 crc kubenswrapper[4837]: I0313 11:54:50.557825 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zckjb" podStartSLOduration=2.186153118 podStartE2EDuration="3.55780051s" podCreationTimestamp="2026-03-13 11:54:47 +0000 UTC" firstStartedPulling="2026-03-13 11:54:48.455745202 +0000 UTC m=+404.094011965" lastFinishedPulling="2026-03-13 11:54:49.827392594 +0000 UTC m=+405.465659357" observedRunningTime="2026-03-13 11:54:50.55560303 +0000 UTC m=+406.193869793" watchObservedRunningTime="2026-03-13 11:54:50.55780051 +0000 UTC m=+406.196067273"
Mar 13 11:54:51 crc kubenswrapper[4837]: I0313 11:54:51.496759 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5v5n" event={"ID":"fec78503-41e5-45f4-9217-1debe55ec107","Type":"ContainerStarted","Data":"01939a6de402f495d6072809827b6480bdfc5a111a835043e7e862cf70198082"}
Mar 13 11:54:51 crc kubenswrapper[4837]: I0313 11:54:51.524783 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m5v5n" podStartSLOduration=2.095969219 podStartE2EDuration="3.524756111s" podCreationTimestamp="2026-03-13 11:54:48 +0000 UTC" firstStartedPulling="2026-03-13 11:54:49.474363549 +0000 UTC m=+405.112630312" lastFinishedPulling="2026-03-13 11:54:50.903150421 +0000 UTC m=+406.541417204" observedRunningTime="2026-03-13 11:54:51.521814967 +0000 UTC m=+407.160081750" watchObservedRunningTime="2026-03-13 11:54:51.524756111 +0000 UTC m=+407.163022874"
Mar 13 11:54:54 crc kubenswrapper[4837]: I0313 11:54:54.726127 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tjvc6"
Mar 13 11:54:54 crc kubenswrapper[4837]: I0313 11:54:54.726496 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tjvc6"
Mar 13 11:54:54 crc kubenswrapper[4837]: I0313 11:54:54.763580 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tjvc6"
Mar 13 11:54:55 crc kubenswrapper[4837]: I0313 11:54:55.560242 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tjvc6"
Mar 13 11:54:56 crc kubenswrapper[4837]: I0313 11:54:56.082605 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kpp2z"
Mar 13 11:54:56 crc kubenswrapper[4837]: I0313 11:54:56.082660 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kpp2z"
Mar 13 11:54:57 crc kubenswrapper[4837]: I0313 11:54:57.122612 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kpp2z" podUID="8d96905d-521e-4ab9-87a8-d6edd0c027ed" containerName="registry-server" probeResult="failure" output=<
Mar 13 11:54:57 crc kubenswrapper[4837]: 	timeout: failed to connect service ":50051" within 1s
Mar 13 11:54:57 crc kubenswrapper[4837]:  >
Mar 13 11:54:57 crc kubenswrapper[4837]: I0313 11:54:57.477986 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zckjb"
Mar 13 11:54:57 crc kubenswrapper[4837]: I0313 11:54:57.478106 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zckjb"
Mar 13 11:54:57 crc kubenswrapper[4837]: I0313 11:54:57.519070 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zckjb"
Mar 13 11:54:57 crc kubenswrapper[4837]: I0313 11:54:57.563088 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zckjb"
Mar 13 11:54:58 crc kubenswrapper[4837]: I0313 11:54:58.480973 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m5v5n"
Mar 13 11:54:58 crc kubenswrapper[4837]: I0313 11:54:58.481009 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m5v5n"
Mar 13 11:54:58 crc kubenswrapper[4837]: I0313 11:54:58.586860 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m5v5n"
Mar 13 11:54:58 crc kubenswrapper[4837]: I0313 11:54:58.625874 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m5v5n"
Mar 13 11:55:05 crc kubenswrapper[4837]: I0313 11:55:05.484443 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 11:55:05 crc kubenswrapper[4837]: I0313 11:55:05.485054 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 11:55:06 crc kubenswrapper[4837]: I0313 11:55:06.130321 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kpp2z"
Mar 13 11:55:06 crc kubenswrapper[4837]: I0313 11:55:06.185118 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kpp2z"
Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.275405 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-l62n7"]
Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.276041 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-l62n7"
Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.295782 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-l62n7"]
Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.315207 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/411da074-7224-4c1c-a75a-a5c3f29c0e92-registry-tls\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7"
Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.315272 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxjlr\" (UniqueName: \"kubernetes.io/projected/411da074-7224-4c1c-a75a-a5c3f29c0e92-kube-api-access-gxjlr\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7"
Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.315331 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/411da074-7224-4c1c-a75a-a5c3f29c0e92-registry-certificates\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7"
Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.315445 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/411da074-7224-4c1c-a75a-a5c3f29c0e92-bound-sa-token\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7"
Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.315498 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7"
Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.315519 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/411da074-7224-4c1c-a75a-a5c3f29c0e92-installation-pull-secrets\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7"
Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.315614 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/411da074-7224-4c1c-a75a-a5c3f29c0e92-trusted-ca\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7"
Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.315704 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/411da074-7224-4c1c-a75a-a5c3f29c0e92-ca-trust-extracted\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7"
Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.339762 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7"
Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.416544 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/411da074-7224-4c1c-a75a-a5c3f29c0e92-registry-tls\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7"
Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.416596 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxjlr\" (UniqueName: \"kubernetes.io/projected/411da074-7224-4c1c-a75a-a5c3f29c0e92-kube-api-access-gxjlr\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7"
Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.416628 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/411da074-7224-4c1c-a75a-a5c3f29c0e92-registry-certificates\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7"
Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.416680 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/411da074-7224-4c1c-a75a-a5c3f29c0e92-bound-sa-token\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7"
Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.416705 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/411da074-7224-4c1c-a75a-a5c3f29c0e92-installation-pull-secrets\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7"
Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.416730 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/411da074-7224-4c1c-a75a-a5c3f29c0e92-trusted-ca\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7"
Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.416758 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/411da074-7224-4c1c-a75a-a5c3f29c0e92-ca-trust-extracted\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7"
Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.417394 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/411da074-7224-4c1c-a75a-a5c3f29c0e92-ca-trust-extracted\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7"
Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.418687 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/411da074-7224-4c1c-a75a-a5c3f29c0e92-trusted-ca\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.418746 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/411da074-7224-4c1c-a75a-a5c3f29c0e92-registry-certificates\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.425465 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/411da074-7224-4c1c-a75a-a5c3f29c0e92-registry-tls\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.434445 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/411da074-7224-4c1c-a75a-a5c3f29c0e92-installation-pull-secrets\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.438730 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxjlr\" (UniqueName: \"kubernetes.io/projected/411da074-7224-4c1c-a75a-a5c3f29c0e92-kube-api-access-gxjlr\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.440812 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/411da074-7224-4c1c-a75a-a5c3f29c0e92-bound-sa-token\") pod 
\"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.593895 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:08 crc kubenswrapper[4837]: I0313 11:55:08.009613 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-l62n7"] Mar 13 11:55:08 crc kubenswrapper[4837]: I0313 11:55:08.591064 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" event={"ID":"411da074-7224-4c1c-a75a-a5c3f29c0e92","Type":"ContainerStarted","Data":"fa9d50ab5efd4579c0c94984a013857fdff7fde0ce55c9c51952d9d7399085ac"} Mar 13 11:55:08 crc kubenswrapper[4837]: I0313 11:55:08.591558 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" event={"ID":"411da074-7224-4c1c-a75a-a5c3f29c0e92","Type":"ContainerStarted","Data":"6608129f77fac4bfe6ff46697f6b6f8e01a96ebc0c5e019b1e4e787315379858"} Mar 13 11:55:08 crc kubenswrapper[4837]: I0313 11:55:08.591724 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:08 crc kubenswrapper[4837]: I0313 11:55:08.617834 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" podStartSLOduration=1.617813887 podStartE2EDuration="1.617813887s" podCreationTimestamp="2026-03-13 11:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:55:08.612154666 +0000 UTC m=+424.250421429" watchObservedRunningTime="2026-03-13 11:55:08.617813887 +0000 UTC m=+424.256080650" Mar 13 11:55:27 crc 
kubenswrapper[4837]: I0313 11:55:27.600090 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:27 crc kubenswrapper[4837]: I0313 11:55:27.643719 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2w96t"] Mar 13 11:55:35 crc kubenswrapper[4837]: I0313 11:55:35.483617 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 11:55:35 crc kubenswrapper[4837]: I0313 11:55:35.484144 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 11:55:52 crc kubenswrapper[4837]: I0313 11:55:52.691323 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" podUID="9da9cfd5-f798-42e0-af98-8378cf8d1e5f" containerName="registry" containerID="cri-o://7f6dc77957ef0c3112728bef3166915837fb45018b662ba23f21fb9a5b1d11d9" gracePeriod=30 Mar 13 11:55:52 crc kubenswrapper[4837]: I0313 11:55:52.839372 4837 generic.go:334] "Generic (PLEG): container finished" podID="9da9cfd5-f798-42e0-af98-8378cf8d1e5f" containerID="7f6dc77957ef0c3112728bef3166915837fb45018b662ba23f21fb9a5b1d11d9" exitCode=0 Mar 13 11:55:52 crc kubenswrapper[4837]: I0313 11:55:52.839417 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" 
event={"ID":"9da9cfd5-f798-42e0-af98-8378cf8d1e5f","Type":"ContainerDied","Data":"7f6dc77957ef0c3112728bef3166915837fb45018b662ba23f21fb9a5b1d11d9"} Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.044110 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.224911 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-trusted-ca\") pod \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.224983 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-registry-certificates\") pod \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.225053 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-bound-sa-token\") pod \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.225410 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.225471 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-ca-trust-extracted\") pod \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.225518 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-installation-pull-secrets\") pod \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.225579 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-registry-tls\") pod \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.225694 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7xs7\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-kube-api-access-g7xs7\") pod \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.226140 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9da9cfd5-f798-42e0-af98-8378cf8d1e5f" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.226827 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "9da9cfd5-f798-42e0-af98-8378cf8d1e5f" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.231256 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-kube-api-access-g7xs7" (OuterVolumeSpecName: "kube-api-access-g7xs7") pod "9da9cfd5-f798-42e0-af98-8378cf8d1e5f" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f"). InnerVolumeSpecName "kube-api-access-g7xs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.235919 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "9da9cfd5-f798-42e0-af98-8378cf8d1e5f" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.235921 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "9da9cfd5-f798-42e0-af98-8378cf8d1e5f" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.236344 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9da9cfd5-f798-42e0-af98-8378cf8d1e5f" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:55:53 crc kubenswrapper[4837]: E0313 11:55:53.237070 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:9da9cfd5-f798-42e0-af98-8378cf8d1e5f nodeName:}" failed. No retries permitted until 2026-03-13 11:55:53.737032374 +0000 UTC m=+469.375299207 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "registry-storage" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "9da9cfd5-f798-42e0-af98-8378cf8d1e5f" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: Unmounter.TearDownAt failed: rpc error: code = Unknown desc = check target path: could not get consistent content of /proc/mounts after 3 attempts Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.240445 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "9da9cfd5-f798-42e0-af98-8378cf8d1e5f" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.326725 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.326771 4837 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.326785 4837 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.326797 4837 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.326807 4837 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.326819 4837 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.326830 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7xs7\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-kube-api-access-g7xs7\") on node \"crc\" DevicePath \"\"" Mar 13 11:55:53 crc 
kubenswrapper[4837]: I0313 11:55:53.833571 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.841367 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "9da9cfd5-f798-42e0-af98-8378cf8d1e5f" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.850185 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" event={"ID":"9da9cfd5-f798-42e0-af98-8378cf8d1e5f","Type":"ContainerDied","Data":"791f2e4e796f079af101ec362853eaa486bb3e46d120e36fdb1c000b9b27a22e"} Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.850232 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.850242 4837 scope.go:117] "RemoveContainer" containerID="7f6dc77957ef0c3112728bef3166915837fb45018b662ba23f21fb9a5b1d11d9" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.879686 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2w96t"] Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.884195 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2w96t"] Mar 13 11:55:55 crc kubenswrapper[4837]: I0313 11:55:55.056323 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9da9cfd5-f798-42e0-af98-8378cf8d1e5f" path="/var/lib/kubelet/pods/9da9cfd5-f798-42e0-af98-8378cf8d1e5f/volumes" Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.125775 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556716-csq4j"] Mar 13 11:56:00 crc kubenswrapper[4837]: E0313 11:56:00.127569 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9da9cfd5-f798-42e0-af98-8378cf8d1e5f" containerName="registry" Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.127606 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da9cfd5-f798-42e0-af98-8378cf8d1e5f" containerName="registry" Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.127748 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="9da9cfd5-f798-42e0-af98-8378cf8d1e5f" containerName="registry" Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.128408 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556716-csq4j" Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.131438 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.132008 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.132528 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.137985 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556716-csq4j"] Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.309025 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltgvw\" (UniqueName: \"kubernetes.io/projected/0a7b275e-9d21-4da0-8bb8-0fee8434ce82-kube-api-access-ltgvw\") pod \"auto-csr-approver-29556716-csq4j\" (UID: \"0a7b275e-9d21-4da0-8bb8-0fee8434ce82\") " pod="openshift-infra/auto-csr-approver-29556716-csq4j" Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.410785 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltgvw\" (UniqueName: \"kubernetes.io/projected/0a7b275e-9d21-4da0-8bb8-0fee8434ce82-kube-api-access-ltgvw\") pod \"auto-csr-approver-29556716-csq4j\" (UID: \"0a7b275e-9d21-4da0-8bb8-0fee8434ce82\") " pod="openshift-infra/auto-csr-approver-29556716-csq4j" Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.430353 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltgvw\" (UniqueName: \"kubernetes.io/projected/0a7b275e-9d21-4da0-8bb8-0fee8434ce82-kube-api-access-ltgvw\") pod \"auto-csr-approver-29556716-csq4j\" (UID: \"0a7b275e-9d21-4da0-8bb8-0fee8434ce82\") " 
pod="openshift-infra/auto-csr-approver-29556716-csq4j" Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.463715 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556716-csq4j" Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.634386 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556716-csq4j"] Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.885698 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556716-csq4j" event={"ID":"0a7b275e-9d21-4da0-8bb8-0fee8434ce82","Type":"ContainerStarted","Data":"abd71683adb08a7e3429bf21f122cd526828d14dc11d94dfe8a93cf1f3cae919"} Mar 13 11:56:02 crc kubenswrapper[4837]: I0313 11:56:02.900356 4837 generic.go:334] "Generic (PLEG): container finished" podID="0a7b275e-9d21-4da0-8bb8-0fee8434ce82" containerID="b8629809cebf6aa743a349229b16e8ffb9aaa032ac5c2d5f39b44ba6478a1a13" exitCode=0 Mar 13 11:56:02 crc kubenswrapper[4837]: I0313 11:56:02.900468 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556716-csq4j" event={"ID":"0a7b275e-9d21-4da0-8bb8-0fee8434ce82","Type":"ContainerDied","Data":"b8629809cebf6aa743a349229b16e8ffb9aaa032ac5c2d5f39b44ba6478a1a13"} Mar 13 11:56:04 crc kubenswrapper[4837]: I0313 11:56:04.132304 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556716-csq4j" Mar 13 11:56:04 crc kubenswrapper[4837]: I0313 11:56:04.261153 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltgvw\" (UniqueName: \"kubernetes.io/projected/0a7b275e-9d21-4da0-8bb8-0fee8434ce82-kube-api-access-ltgvw\") pod \"0a7b275e-9d21-4da0-8bb8-0fee8434ce82\" (UID: \"0a7b275e-9d21-4da0-8bb8-0fee8434ce82\") " Mar 13 11:56:04 crc kubenswrapper[4837]: I0313 11:56:04.266318 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a7b275e-9d21-4da0-8bb8-0fee8434ce82-kube-api-access-ltgvw" (OuterVolumeSpecName: "kube-api-access-ltgvw") pod "0a7b275e-9d21-4da0-8bb8-0fee8434ce82" (UID: "0a7b275e-9d21-4da0-8bb8-0fee8434ce82"). InnerVolumeSpecName "kube-api-access-ltgvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:56:04 crc kubenswrapper[4837]: I0313 11:56:04.362569 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltgvw\" (UniqueName: \"kubernetes.io/projected/0a7b275e-9d21-4da0-8bb8-0fee8434ce82-kube-api-access-ltgvw\") on node \"crc\" DevicePath \"\"" Mar 13 11:56:04 crc kubenswrapper[4837]: I0313 11:56:04.914940 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556716-csq4j" event={"ID":"0a7b275e-9d21-4da0-8bb8-0fee8434ce82","Type":"ContainerDied","Data":"abd71683adb08a7e3429bf21f122cd526828d14dc11d94dfe8a93cf1f3cae919"} Mar 13 11:56:04 crc kubenswrapper[4837]: I0313 11:56:04.915269 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abd71683adb08a7e3429bf21f122cd526828d14dc11d94dfe8a93cf1f3cae919" Mar 13 11:56:04 crc kubenswrapper[4837]: I0313 11:56:04.914993 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556716-csq4j" Mar 13 11:56:05 crc kubenswrapper[4837]: I0313 11:56:05.200034 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556710-lcprh"] Mar 13 11:56:05 crc kubenswrapper[4837]: I0313 11:56:05.203271 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556710-lcprh"] Mar 13 11:56:05 crc kubenswrapper[4837]: I0313 11:56:05.483716 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 11:56:05 crc kubenswrapper[4837]: I0313 11:56:05.483807 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 11:56:05 crc kubenswrapper[4837]: I0313 11:56:05.483881 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 11:56:05 crc kubenswrapper[4837]: I0313 11:56:05.484860 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ea590165224c827e615cf9230078895eabcfa03489c1d34d92662f043fe58752"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 11:56:05 crc kubenswrapper[4837]: I0313 11:56:05.485030 4837 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://ea590165224c827e615cf9230078895eabcfa03489c1d34d92662f043fe58752" gracePeriod=600
Mar 13 11:56:05 crc kubenswrapper[4837]: I0313 11:56:05.922723 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="ea590165224c827e615cf9230078895eabcfa03489c1d34d92662f043fe58752" exitCode=0
Mar 13 11:56:05 crc kubenswrapper[4837]: I0313 11:56:05.922763 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"ea590165224c827e615cf9230078895eabcfa03489c1d34d92662f043fe58752"}
Mar 13 11:56:05 crc kubenswrapper[4837]: I0313 11:56:05.923014 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"2bc74d238c7c3c8f94dfb05f2715b04d643751479532bb38893e7ef8db5a10d2"}
Mar 13 11:56:05 crc kubenswrapper[4837]: I0313 11:56:05.923034 4837 scope.go:117] "RemoveContainer" containerID="87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a"
Mar 13 11:56:07 crc kubenswrapper[4837]: I0313 11:56:07.056332 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0484d991-f239-47a2-80ff-0237945c27ac" path="/var/lib/kubelet/pods/0484d991-f239-47a2-80ff-0237945c27ac/volumes"
Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.143883 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556718-7z6qj"]
Mar 13 11:58:00 crc kubenswrapper[4837]: E0313 11:58:00.144947 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a7b275e-9d21-4da0-8bb8-0fee8434ce82" containerName="oc"
Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.144967 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a7b275e-9d21-4da0-8bb8-0fee8434ce82" containerName="oc"
Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.145088 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a7b275e-9d21-4da0-8bb8-0fee8434ce82" containerName="oc"
Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.145624 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556718-7z6qj"
Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.149899 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj"
Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.149995 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.150012 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.150329 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556718-7z6qj"]
Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.177044 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qskx\" (UniqueName: \"kubernetes.io/projected/aa01e7a4-71d3-4c91-8319-52a575269601-kube-api-access-2qskx\") pod \"auto-csr-approver-29556718-7z6qj\" (UID: \"aa01e7a4-71d3-4c91-8319-52a575269601\") " pod="openshift-infra/auto-csr-approver-29556718-7z6qj"
Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.278032 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qskx\" (UniqueName: \"kubernetes.io/projected/aa01e7a4-71d3-4c91-8319-52a575269601-kube-api-access-2qskx\") pod \"auto-csr-approver-29556718-7z6qj\" (UID: \"aa01e7a4-71d3-4c91-8319-52a575269601\") " pod="openshift-infra/auto-csr-approver-29556718-7z6qj"
Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.298382 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qskx\" (UniqueName: \"kubernetes.io/projected/aa01e7a4-71d3-4c91-8319-52a575269601-kube-api-access-2qskx\") pod \"auto-csr-approver-29556718-7z6qj\" (UID: \"aa01e7a4-71d3-4c91-8319-52a575269601\") " pod="openshift-infra/auto-csr-approver-29556718-7z6qj"
Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.471595 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556718-7z6qj"
Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.682910 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556718-7z6qj"]
Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.688362 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 11:58:01 crc kubenswrapper[4837]: I0313 11:58:01.613757 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556718-7z6qj" event={"ID":"aa01e7a4-71d3-4c91-8319-52a575269601","Type":"ContainerStarted","Data":"82ceeb88401e1a8931c1864b4f9ba89d57c9e6f7a5005ce9219be818b9826cdb"}
Mar 13 11:58:02 crc kubenswrapper[4837]: I0313 11:58:02.621148 4837 generic.go:334] "Generic (PLEG): container finished" podID="aa01e7a4-71d3-4c91-8319-52a575269601" containerID="f165f764ee51b6b29672c3c9a0ac54376301b2d6f3ce983abfa09b63813909b9" exitCode=0
Mar 13 11:58:02 crc kubenswrapper[4837]: I0313 11:58:02.621200 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556718-7z6qj" event={"ID":"aa01e7a4-71d3-4c91-8319-52a575269601","Type":"ContainerDied","Data":"f165f764ee51b6b29672c3c9a0ac54376301b2d6f3ce983abfa09b63813909b9"}
Mar 13 11:58:03 crc kubenswrapper[4837]: I0313 11:58:03.870697 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556718-7z6qj"
Mar 13 11:58:04 crc kubenswrapper[4837]: I0313 11:58:04.022021 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qskx\" (UniqueName: \"kubernetes.io/projected/aa01e7a4-71d3-4c91-8319-52a575269601-kube-api-access-2qskx\") pod \"aa01e7a4-71d3-4c91-8319-52a575269601\" (UID: \"aa01e7a4-71d3-4c91-8319-52a575269601\") "
Mar 13 11:58:04 crc kubenswrapper[4837]: I0313 11:58:04.027773 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa01e7a4-71d3-4c91-8319-52a575269601-kube-api-access-2qskx" (OuterVolumeSpecName: "kube-api-access-2qskx") pod "aa01e7a4-71d3-4c91-8319-52a575269601" (UID: "aa01e7a4-71d3-4c91-8319-52a575269601"). InnerVolumeSpecName "kube-api-access-2qskx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 11:58:04 crc kubenswrapper[4837]: I0313 11:58:04.123351 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qskx\" (UniqueName: \"kubernetes.io/projected/aa01e7a4-71d3-4c91-8319-52a575269601-kube-api-access-2qskx\") on node \"crc\" DevicePath \"\""
Mar 13 11:58:04 crc kubenswrapper[4837]: I0313 11:58:04.638548 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556718-7z6qj" event={"ID":"aa01e7a4-71d3-4c91-8319-52a575269601","Type":"ContainerDied","Data":"82ceeb88401e1a8931c1864b4f9ba89d57c9e6f7a5005ce9219be818b9826cdb"}
Mar 13 11:58:04 crc kubenswrapper[4837]: I0313 11:58:04.638594 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82ceeb88401e1a8931c1864b4f9ba89d57c9e6f7a5005ce9219be818b9826cdb"
Mar 13 11:58:04 crc kubenswrapper[4837]: I0313 11:58:04.638681 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556718-7z6qj"
Mar 13 11:58:04 crc kubenswrapper[4837]: I0313 11:58:04.938702 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556712-g8877"]
Mar 13 11:58:04 crc kubenswrapper[4837]: I0313 11:58:04.944908 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556712-g8877"]
Mar 13 11:58:05 crc kubenswrapper[4837]: I0313 11:58:05.059875 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87edec8a-33b2-44c0-bbcb-1e4f5dded1b2" path="/var/lib/kubelet/pods/87edec8a-33b2-44c0-bbcb-1e4f5dded1b2/volumes"
Mar 13 11:58:05 crc kubenswrapper[4837]: I0313 11:58:05.484512 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 11:58:05 crc kubenswrapper[4837]: I0313 11:58:05.484589 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 11:58:21 crc kubenswrapper[4837]: I0313 11:58:21.629800 4837 scope.go:117] "RemoveContainer" containerID="7788f0babcbd0ba3005289dc42abd3560a56f1f0efe57b0376342454820793c4"
Mar 13 11:58:35 crc kubenswrapper[4837]: I0313 11:58:35.484151 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 11:58:35 crc kubenswrapper[4837]: I0313 11:58:35.484777 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 11:59:05 crc kubenswrapper[4837]: I0313 11:59:05.483619 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 11:59:05 crc kubenswrapper[4837]: I0313 11:59:05.484199 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 11:59:05 crc kubenswrapper[4837]: I0313 11:59:05.484244 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d"
Mar 13 11:59:05 crc kubenswrapper[4837]: I0313 11:59:05.484848 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2bc74d238c7c3c8f94dfb05f2715b04d643751479532bb38893e7ef8db5a10d2"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 11:59:05 crc kubenswrapper[4837]: I0313 11:59:05.484939 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://2bc74d238c7c3c8f94dfb05f2715b04d643751479532bb38893e7ef8db5a10d2" gracePeriod=600
Mar 13 11:59:06 crc kubenswrapper[4837]: I0313 11:59:06.031539 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="2bc74d238c7c3c8f94dfb05f2715b04d643751479532bb38893e7ef8db5a10d2" exitCode=0
Mar 13 11:59:06 crc kubenswrapper[4837]: I0313 11:59:06.031870 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"2bc74d238c7c3c8f94dfb05f2715b04d643751479532bb38893e7ef8db5a10d2"}
Mar 13 11:59:06 crc kubenswrapper[4837]: I0313 11:59:06.031974 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"86010f8ae6e03e22840b0db405e4816a52e1a80af0eff6188dd5d3d81e63937a"}
Mar 13 11:59:06 crc kubenswrapper[4837]: I0313 11:59:06.032016 4837 scope.go:117] "RemoveContainer" containerID="ea590165224c827e615cf9230078895eabcfa03489c1d34d92662f043fe58752"
Mar 13 11:59:21 crc kubenswrapper[4837]: I0313 11:59:21.670629 4837 scope.go:117] "RemoveContainer" containerID="8c4d75bce91d26c5c90ccce3126b557507017a92b0dd1db884cee46957fc8b2f"
Mar 13 11:59:21 crc kubenswrapper[4837]: I0313 11:59:21.722387 4837 scope.go:117] "RemoveContainer" containerID="b2ba4ee22041e914a3b1573c300fce67d3ac337c4d9b3d85c86421a82bc9711f"
Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.661279 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-xzv5h"]
Mar 13 11:59:40 crc kubenswrapper[4837]: E0313 11:59:40.662088 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa01e7a4-71d3-4c91-8319-52a575269601" containerName="oc"
Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.662105 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa01e7a4-71d3-4c91-8319-52a575269601" containerName="oc"
Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.662232 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa01e7a4-71d3-4c91-8319-52a575269601" containerName="oc"
Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.662682 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-xzv5h"
Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.664867 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.665712 4837 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-xdfzs"
Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.666414 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.693562 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-xzv5h"]
Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.701794 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-dlspp"]
Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.702613 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-dlspp"
Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.705845 4837 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-jnncb"
Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.709501 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-ht9vn"]
Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.710576 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-ht9vn"
Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.716489 4837 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-25pqh"
Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.718953 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-dlspp"]
Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.723336 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-ht9vn"]
Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.770048 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47djl\" (UniqueName: \"kubernetes.io/projected/67507b8e-35d5-4dff-9239-45b5ef997e53-kube-api-access-47djl\") pod \"cert-manager-cainjector-cf98fcc89-xzv5h\" (UID: \"67507b8e-35d5-4dff-9239-45b5ef997e53\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-xzv5h"
Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.770105 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgn78\" (UniqueName: \"kubernetes.io/projected/0e500b82-1f14-4a1e-937d-00248f195033-kube-api-access-xgn78\") pod \"cert-manager-webhook-687f57d79b-ht9vn\" (UID: \"0e500b82-1f14-4a1e-937d-00248f195033\") " pod="cert-manager/cert-manager-webhook-687f57d79b-ht9vn"
Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.770132 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv2kk\" (UniqueName: \"kubernetes.io/projected/5ecc1237-3421-41d5-8efb-a62399ae1d73-kube-api-access-hv2kk\") pod \"cert-manager-858654f9db-dlspp\" (UID: \"5ecc1237-3421-41d5-8efb-a62399ae1d73\") " pod="cert-manager/cert-manager-858654f9db-dlspp"
Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.871390 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv2kk\" (UniqueName: \"kubernetes.io/projected/5ecc1237-3421-41d5-8efb-a62399ae1d73-kube-api-access-hv2kk\") pod \"cert-manager-858654f9db-dlspp\" (UID: \"5ecc1237-3421-41d5-8efb-a62399ae1d73\") " pod="cert-manager/cert-manager-858654f9db-dlspp"
Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.871703 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47djl\" (UniqueName: \"kubernetes.io/projected/67507b8e-35d5-4dff-9239-45b5ef997e53-kube-api-access-47djl\") pod \"cert-manager-cainjector-cf98fcc89-xzv5h\" (UID: \"67507b8e-35d5-4dff-9239-45b5ef997e53\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-xzv5h"
Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.871803 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgn78\" (UniqueName: \"kubernetes.io/projected/0e500b82-1f14-4a1e-937d-00248f195033-kube-api-access-xgn78\") pod \"cert-manager-webhook-687f57d79b-ht9vn\" (UID: \"0e500b82-1f14-4a1e-937d-00248f195033\") " pod="cert-manager/cert-manager-webhook-687f57d79b-ht9vn"
Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.889506 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47djl\" (UniqueName: \"kubernetes.io/projected/67507b8e-35d5-4dff-9239-45b5ef997e53-kube-api-access-47djl\") pod \"cert-manager-cainjector-cf98fcc89-xzv5h\" (UID: \"67507b8e-35d5-4dff-9239-45b5ef997e53\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-xzv5h"
Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.894290 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgn78\" (UniqueName: \"kubernetes.io/projected/0e500b82-1f14-4a1e-937d-00248f195033-kube-api-access-xgn78\") pod \"cert-manager-webhook-687f57d79b-ht9vn\" (UID: \"0e500b82-1f14-4a1e-937d-00248f195033\") " pod="cert-manager/cert-manager-webhook-687f57d79b-ht9vn"
Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.894311 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv2kk\" (UniqueName: \"kubernetes.io/projected/5ecc1237-3421-41d5-8efb-a62399ae1d73-kube-api-access-hv2kk\") pod \"cert-manager-858654f9db-dlspp\" (UID: \"5ecc1237-3421-41d5-8efb-a62399ae1d73\") " pod="cert-manager/cert-manager-858654f9db-dlspp"
Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.996595 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-xzv5h"
Mar 13 11:59:41 crc kubenswrapper[4837]: I0313 11:59:41.021834 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-dlspp"
Mar 13 11:59:41 crc kubenswrapper[4837]: I0313 11:59:41.035167 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-ht9vn"
Mar 13 11:59:41 crc kubenswrapper[4837]: I0313 11:59:41.255772 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-dlspp"]
Mar 13 11:59:41 crc kubenswrapper[4837]: W0313 11:59:41.264014 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ecc1237_3421_41d5_8efb_a62399ae1d73.slice/crio-7b0568ddce0a7122ede669bc4540653f97b3616bf1c2a18fd7ad5b933bc2ddcd WatchSource:0}: Error finding container 7b0568ddce0a7122ede669bc4540653f97b3616bf1c2a18fd7ad5b933bc2ddcd: Status 404 returned error can't find the container with id 7b0568ddce0a7122ede669bc4540653f97b3616bf1c2a18fd7ad5b933bc2ddcd
Mar 13 11:59:41 crc kubenswrapper[4837]: I0313 11:59:41.299496 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-ht9vn"]
Mar 13 11:59:41 crc kubenswrapper[4837]: W0313 11:59:41.303798 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e500b82_1f14_4a1e_937d_00248f195033.slice/crio-ee9015113da4c7d3fbcd027aed4b686615b86b9a2292c4739914d0c68ed415d4 WatchSource:0}: Error finding container ee9015113da4c7d3fbcd027aed4b686615b86b9a2292c4739914d0c68ed415d4: Status 404 returned error can't find the container with id ee9015113da4c7d3fbcd027aed4b686615b86b9a2292c4739914d0c68ed415d4
Mar 13 11:59:41 crc kubenswrapper[4837]: I0313 11:59:41.416409 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-xzv5h"]
Mar 13 11:59:41 crc kubenswrapper[4837]: W0313 11:59:41.418106 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67507b8e_35d5_4dff_9239_45b5ef997e53.slice/crio-159f09b062282a35a2aaedb5d409c6e711dd40d0c03a3546319f1b4cc4145fe0 WatchSource:0}: Error finding container 159f09b062282a35a2aaedb5d409c6e711dd40d0c03a3546319f1b4cc4145fe0: Status 404 returned error can't find the container with id 159f09b062282a35a2aaedb5d409c6e711dd40d0c03a3546319f1b4cc4145fe0
Mar 13 11:59:42 crc kubenswrapper[4837]: I0313 11:59:42.256997 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-xzv5h" event={"ID":"67507b8e-35d5-4dff-9239-45b5ef997e53","Type":"ContainerStarted","Data":"159f09b062282a35a2aaedb5d409c6e711dd40d0c03a3546319f1b4cc4145fe0"}
Mar 13 11:59:42 crc kubenswrapper[4837]: I0313 11:59:42.258741 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-dlspp" event={"ID":"5ecc1237-3421-41d5-8efb-a62399ae1d73","Type":"ContainerStarted","Data":"7b0568ddce0a7122ede669bc4540653f97b3616bf1c2a18fd7ad5b933bc2ddcd"}
Mar 13 11:59:42 crc kubenswrapper[4837]: I0313 11:59:42.261471 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-ht9vn" event={"ID":"0e500b82-1f14-4a1e-937d-00248f195033","Type":"ContainerStarted","Data":"ee9015113da4c7d3fbcd027aed4b686615b86b9a2292c4739914d0c68ed415d4"}
Mar 13 11:59:45 crc kubenswrapper[4837]: I0313 11:59:45.281032 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-dlspp" event={"ID":"5ecc1237-3421-41d5-8efb-a62399ae1d73","Type":"ContainerStarted","Data":"44d7dab06b3023c31fc534449c16fc8ad640daae562b4e6c0be834a4f3240fd7"}
Mar 13 11:59:45 crc kubenswrapper[4837]: I0313 11:59:45.282915 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-xzv5h" event={"ID":"67507b8e-35d5-4dff-9239-45b5ef997e53","Type":"ContainerStarted","Data":"1a9bec7e517ddcc37bb4fd44363c9eda050021d2f6a4bf27d0650acae2c529a7"}
Mar 13 11:59:45 crc kubenswrapper[4837]: I0313 11:59:45.307666 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-dlspp" podStartSLOduration=2.8141037669999998 podStartE2EDuration="5.307614523s" podCreationTimestamp="2026-03-13 11:59:40 +0000 UTC" firstStartedPulling="2026-03-13 11:59:41.270708346 +0000 UTC m=+696.908975109" lastFinishedPulling="2026-03-13 11:59:43.764219082 +0000 UTC m=+699.402485865" observedRunningTime="2026-03-13 11:59:45.302256652 +0000 UTC m=+700.940523415" watchObservedRunningTime="2026-03-13 11:59:45.307614523 +0000 UTC m=+700.945881286"
Mar 13 11:59:45 crc kubenswrapper[4837]: I0313 11:59:45.319738 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-xzv5h" podStartSLOduration=2.416130128 podStartE2EDuration="5.319714617s" podCreationTimestamp="2026-03-13 11:59:40 +0000 UTC" firstStartedPulling="2026-03-13 11:59:41.420380754 +0000 UTC m=+697.058647517" lastFinishedPulling="2026-03-13 11:59:44.323965253 +0000 UTC m=+699.962232006" observedRunningTime="2026-03-13 11:59:45.316865316 +0000 UTC m=+700.955132099" watchObservedRunningTime="2026-03-13 11:59:45.319714617 +0000 UTC m=+700.957981400"
Mar 13 11:59:46 crc kubenswrapper[4837]: I0313 11:59:46.294947 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-ht9vn" event={"ID":"0e500b82-1f14-4a1e-937d-00248f195033","Type":"ContainerStarted","Data":"778bc71e71fb427e60132faf96dc5d3b619b9b60802da5fc5908a5736d49e00f"}
Mar 13 11:59:46 crc kubenswrapper[4837]: I0313 11:59:46.321254 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-ht9vn" podStartSLOduration=2.403676423 podStartE2EDuration="6.321225614s" podCreationTimestamp="2026-03-13 11:59:40 +0000 UTC" firstStartedPulling="2026-03-13 11:59:41.305900292 +0000 UTC m=+696.944167055" lastFinishedPulling="2026-03-13 11:59:45.223449483 +0000 UTC m=+700.861716246" observedRunningTime="2026-03-13 11:59:46.314450289 +0000 UTC m=+701.952717062" watchObservedRunningTime="2026-03-13 11:59:46.321225614 +0000 UTC m=+701.959492387"
Mar 13 11:59:47 crc kubenswrapper[4837]: I0313 11:59:47.300101 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-ht9vn"
Mar 13 11:59:50 crc kubenswrapper[4837]: I0313 11:59:50.728250 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4zzrs"]
Mar 13 11:59:50 crc kubenswrapper[4837]: I0313 11:59:50.730298 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="kube-rbac-proxy-node" containerID="cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2" gracePeriod=30
Mar 13 11:59:50 crc kubenswrapper[4837]: I0313 11:59:50.730423 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793" gracePeriod=30
Mar 13 11:59:50 crc kubenswrapper[4837]: I0313 11:59:50.730435 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovn-acl-logging" containerID="cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d" gracePeriod=30
Mar 13 11:59:50 crc kubenswrapper[4837]: I0313 11:59:50.730335 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="northd" containerID="cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7" gracePeriod=30
Mar 13 11:59:50 crc kubenswrapper[4837]: I0313 11:59:50.730528 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="sbdb" containerID="cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a" gracePeriod=30
Mar 13 11:59:50 crc kubenswrapper[4837]: I0313 11:59:50.730252 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovn-controller" containerID="cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1" gracePeriod=30
Mar 13 11:59:50 crc kubenswrapper[4837]: I0313 11:59:50.733550 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="nbdb" containerID="cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39" gracePeriod=30
Mar 13 11:59:50 crc kubenswrapper[4837]: I0313 11:59:50.757283 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller" containerID="cri-o://f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282" gracePeriod=30
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.038124 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-ht9vn"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.107710 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/3.log"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.110458 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovn-acl-logging/0.log"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.111047 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovn-controller/0.log"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.111577 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.172624 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6bbfz"]
Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.172861 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="kube-rbac-proxy-ovn-metrics"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.172876 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="kube-rbac-proxy-ovn-metrics"
Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.172887 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.172893 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller"
Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.172901 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.172907 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller"
Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.172914 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovn-acl-logging"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.172922 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovn-acl-logging"
Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.172933 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.172940 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller"
Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.172951 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.172960 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller"
Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.172971 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="northd"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.172976 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="northd"
Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.172986 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="sbdb"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.172992 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="sbdb"
Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.173002 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="nbdb"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173009 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="nbdb"
Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.173017 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="kube-rbac-proxy-node"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173025 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="kube-rbac-proxy-node"
Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.173035 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="kubecfg-setup"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173041 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="kubecfg-setup"
Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.173051 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovn-controller"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173056 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovn-controller"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173170 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="northd"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173182 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173191 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="sbdb"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173200 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="nbdb"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173209 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173219 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="kube-rbac-proxy-node"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173228 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173237 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173252 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovn-controller"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173263 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovn-acl-logging"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173272 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="kube-rbac-proxy-ovn-metrics"
Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.173468 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173479 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173612 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.178545 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz"
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.218840 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-kubelet\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") "
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.218944 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85hll\" (UniqueName: \"kubernetes.io/projected/43df29f7-1351-41f5-bfca-17f804837cb4-kube-api-access-85hll\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") "
Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.218984 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "host-kubelet".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219005 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-openvswitch\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219078 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-systemd-units\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219139 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-ovn\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219081 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219213 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219255 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-log-socket" (OuterVolumeSpecName: "log-socket") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219207 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-log-socket\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219281 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219287 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-systemd\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219325 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-cni-netd\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219347 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-var-lib-openvswitch\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219374 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-cni-bin\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219404 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219434 4837 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43df29f7-1351-41f5-bfca-17f804837cb4-ovn-node-metrics-cert\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219462 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-etc-openvswitch\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219486 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-ovnkube-config\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219509 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-run-netns\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219536 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-run-ovn-kubernetes\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219555 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-slash\") pod 
\"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219588 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-ovnkube-script-lib\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219609 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-env-overrides\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219630 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-node-log\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219880 4837 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219905 4837 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219920 4837 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219932 
4837 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-log-socket\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219943 4837 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219976 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-node-log" (OuterVolumeSpecName: "node-log") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.220001 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.220025 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.220047 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.220073 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.220393 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.220481 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.220556 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.220834 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.220849 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.220702 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-slash" (OuterVolumeSpecName: "host-slash") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.221512 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.226458 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43df29f7-1351-41f5-bfca-17f804837cb4-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.226741 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43df29f7-1351-41f5-bfca-17f804837cb4-kube-api-access-85hll" (OuterVolumeSpecName: "kube-api-access-85hll") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "kube-api-access-85hll". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.235720 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.321449 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-run-netns\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.321516 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-run-openvswitch\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.321541 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-run-systemd\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.321565 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-etc-openvswitch\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.321586 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-node-log\") pod \"ovnkube-node-6bbfz\" (UID: 
\"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.321613 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.321718 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-systemd-units\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.321746 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-log-socket\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.321781 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-slash\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.321810 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/7b564b0f-ab5a-454b-8588-a645fdec0058-ovn-node-metrics-cert\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.321834 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-var-lib-openvswitch\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322005 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7b564b0f-ab5a-454b-8588-a645fdec0058-ovnkube-script-lib\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322082 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-kubelet\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322126 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-cni-bin\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322172 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b564b0f-ab5a-454b-8588-a645fdec0058-env-overrides\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322194 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj4xf\" (UniqueName: \"kubernetes.io/projected/7b564b0f-ab5a-454b-8588-a645fdec0058-kube-api-access-dj4xf\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322247 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-cni-netd\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322294 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b564b0f-ab5a-454b-8588-a645fdec0058-ovnkube-config\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322313 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-run-ovn-kubernetes\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322385 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-run-ovn\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322447 4837 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322486 4837 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322498 4837 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322506 4837 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322520 4837 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322530 4837 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43df29f7-1351-41f5-bfca-17f804837cb4-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc 
kubenswrapper[4837]: I0313 11:59:51.322540 4837 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322549 4837 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322559 4837 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322568 4837 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322578 4837 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-slash\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322587 4837 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322595 4837 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-node-log\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322604 4837 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322613 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85hll\" (UniqueName: \"kubernetes.io/projected/43df29f7-1351-41f5-bfca-17f804837cb4-kube-api-access-85hll\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.338500 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qg957_cbb3f4c6-a6c5-4059-8beb-04179d70aff5/kube-multus/2.log" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.339030 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qg957_cbb3f4c6-a6c5-4059-8beb-04179d70aff5/kube-multus/1.log" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.339078 4837 generic.go:334] "Generic (PLEG): container finished" podID="cbb3f4c6-a6c5-4059-8beb-04179d70aff5" containerID="1effae1c86d3c4f5369295262f269b1dad692c561321e1c868d2b4fe7f736d7c" exitCode=2 Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.339154 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qg957" event={"ID":"cbb3f4c6-a6c5-4059-8beb-04179d70aff5","Type":"ContainerDied","Data":"1effae1c86d3c4f5369295262f269b1dad692c561321e1c868d2b4fe7f736d7c"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.339195 4837 scope.go:117] "RemoveContainer" containerID="19b8a72f10c691a74098997e9d2383adf1aeb1811ad22dc8a74b5a47945d1e3e" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.339710 4837 scope.go:117] "RemoveContainer" containerID="1effae1c86d3c4f5369295262f269b1dad692c561321e1c868d2b4fe7f736d7c" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.339925 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed 
container=kube-multus pod=multus-qg957_openshift-multus(cbb3f4c6-a6c5-4059-8beb-04179d70aff5)\"" pod="openshift-multus/multus-qg957" podUID="cbb3f4c6-a6c5-4059-8beb-04179d70aff5" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.341546 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/3.log" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.344949 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovn-acl-logging/0.log" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.345455 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovn-controller/0.log" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346019 4837 generic.go:334] "Generic (PLEG): container finished" podID="43df29f7-1351-41f5-bfca-17f804837cb4" containerID="f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282" exitCode=0 Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346184 4837 generic.go:334] "Generic (PLEG): container finished" podID="43df29f7-1351-41f5-bfca-17f804837cb4" containerID="60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a" exitCode=0 Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346198 4837 generic.go:334] "Generic (PLEG): container finished" podID="43df29f7-1351-41f5-bfca-17f804837cb4" containerID="80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39" exitCode=0 Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346208 4837 generic.go:334] "Generic (PLEG): container finished" podID="43df29f7-1351-41f5-bfca-17f804837cb4" containerID="7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7" exitCode=0 Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346104 4837 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346316 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346166 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346366 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346385 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346406 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346218 4837 generic.go:334] "Generic (PLEG): container finished" podID="43df29f7-1351-41f5-bfca-17f804837cb4" containerID="954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793" exitCode=0 Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 
11:59:51.346443 4837 generic.go:334] "Generic (PLEG): container finished" podID="43df29f7-1351-41f5-bfca-17f804837cb4" containerID="bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2" exitCode=0 Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346462 4837 generic.go:334] "Generic (PLEG): container finished" podID="43df29f7-1351-41f5-bfca-17f804837cb4" containerID="c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d" exitCode=143 Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346477 4837 generic.go:334] "Generic (PLEG): container finished" podID="43df29f7-1351-41f5-bfca-17f804837cb4" containerID="b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1" exitCode=143 Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346500 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346524 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346540 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346570 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346581 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346592 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346602 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346613 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346620 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346627 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346654 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346684 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346700 4837 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346709 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346717 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346724 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346732 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346739 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346746 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346754 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346762 4837 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346769 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346779 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346791 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346798 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346805 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346812 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346819 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7"} Mar 13 
11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346825 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346835 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346842 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346850 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346857 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346867 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"17148b76b47a8d352ae2adca8c21dbaa4b189a84d57c2f7678c2d83f59bfc901"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346880 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346888 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346896 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346905 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346912 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346920 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346927 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346933 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346941 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346947 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.382804 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4zzrs"] Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.388555 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4zzrs"] Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424451 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-etc-openvswitch\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424531 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-node-log\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424564 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424587 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-systemd-units\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424613 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-log-socket\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424623 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-etc-openvswitch\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424692 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-slash\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424698 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-log-socket\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424735 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-slash\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424746 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b564b0f-ab5a-454b-8588-a645fdec0058-ovn-node-metrics-cert\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424766 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-node-log\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424795 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-var-lib-openvswitch\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424828 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7b564b0f-ab5a-454b-8588-a645fdec0058-ovnkube-script-lib\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424851 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-kubelet\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424879 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-cni-bin\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424900 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b564b0f-ab5a-454b-8588-a645fdec0058-env-overrides\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424924 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj4xf\" (UniqueName: \"kubernetes.io/projected/7b564b0f-ab5a-454b-8588-a645fdec0058-kube-api-access-dj4xf\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424948 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-cni-netd\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424938 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-systemd-units\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424981 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/7b564b0f-ab5a-454b-8588-a645fdec0058-ovnkube-config\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424730 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425272 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-cni-bin\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425299 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-var-lib-openvswitch\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425304 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-run-ovn-kubernetes\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425348 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-run-ovn\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425443 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-run-netns\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425477 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-run-openvswitch\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425533 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-run-systemd\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425553 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-run-ovn\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425710 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-run-openvswitch\") pod \"ovnkube-node-6bbfz\" (UID: 
\"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425744 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-kubelet\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425753 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-run-systemd\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425788 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-run-ovn-kubernetes\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425825 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-cni-netd\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425845 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-run-netns\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc 
kubenswrapper[4837]: I0313 11:59:51.425867 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b564b0f-ab5a-454b-8588-a645fdec0058-ovnkube-config\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425888 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7b564b0f-ab5a-454b-8588-a645fdec0058-ovnkube-script-lib\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.426514 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b564b0f-ab5a-454b-8588-a645fdec0058-env-overrides\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.428539 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b564b0f-ab5a-454b-8588-a645fdec0058-ovn-node-metrics-cert\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.444146 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj4xf\" (UniqueName: \"kubernetes.io/projected/7b564b0f-ab5a-454b-8588-a645fdec0058-kube-api-access-dj4xf\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.451792 4837 scope.go:117] "RemoveContainer" 
containerID="f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.481972 4837 scope.go:117] "RemoveContainer" containerID="01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.499108 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.512163 4837 scope.go:117] "RemoveContainer" containerID="60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.547400 4837 scope.go:117] "RemoveContainer" containerID="80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.571781 4837 scope.go:117] "RemoveContainer" containerID="7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.593550 4837 scope.go:117] "RemoveContainer" containerID="954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.614258 4837 scope.go:117] "RemoveContainer" containerID="bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.636952 4837 scope.go:117] "RemoveContainer" containerID="c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.655348 4837 scope.go:117] "RemoveContainer" containerID="b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.678493 4837 scope.go:117] "RemoveContainer" containerID="4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.705720 4837 scope.go:117] "RemoveContainer" 
containerID="f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.706821 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282\": container with ID starting with f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282 not found: ID does not exist" containerID="f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.706885 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282"} err="failed to get container status \"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282\": rpc error: code = NotFound desc = could not find container \"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282\": container with ID starting with f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.706912 4837 scope.go:117] "RemoveContainer" containerID="01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.707321 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\": container with ID starting with 01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c not found: ID does not exist" containerID="01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.707390 4837 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c"} err="failed to get container status \"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\": rpc error: code = NotFound desc = could not find container \"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\": container with ID starting with 01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.707431 4837 scope.go:117] "RemoveContainer" containerID="60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.707970 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\": container with ID starting with 60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a not found: ID does not exist" containerID="60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.708001 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a"} err="failed to get container status \"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\": rpc error: code = NotFound desc = could not find container \"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\": container with ID starting with 60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.708022 4837 scope.go:117] "RemoveContainer" containerID="80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.708316 4837 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\": container with ID starting with 80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39 not found: ID does not exist" containerID="80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.708340 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39"} err="failed to get container status \"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\": rpc error: code = NotFound desc = could not find container \"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\": container with ID starting with 80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.708356 4837 scope.go:117] "RemoveContainer" containerID="7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.708710 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\": container with ID starting with 7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7 not found: ID does not exist" containerID="7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.708819 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7"} err="failed to get container status \"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\": rpc error: code = NotFound desc = could not find container 
\"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\": container with ID starting with 7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.708842 4837 scope.go:117] "RemoveContainer" containerID="954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.709118 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\": container with ID starting with 954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793 not found: ID does not exist" containerID="954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.709142 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793"} err="failed to get container status \"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\": rpc error: code = NotFound desc = could not find container \"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\": container with ID starting with 954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.709158 4837 scope.go:117] "RemoveContainer" containerID="bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.709467 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\": container with ID starting with bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2 not found: ID does not exist" 
containerID="bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.709496 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2"} err="failed to get container status \"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\": rpc error: code = NotFound desc = could not find container \"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\": container with ID starting with bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.709517 4837 scope.go:117] "RemoveContainer" containerID="c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.709830 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\": container with ID starting with c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d not found: ID does not exist" containerID="c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.709861 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d"} err="failed to get container status \"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\": rpc error: code = NotFound desc = could not find container \"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\": container with ID starting with c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.709884 4837 scope.go:117] 
"RemoveContainer" containerID="b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.710146 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\": container with ID starting with b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1 not found: ID does not exist" containerID="b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.710173 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1"} err="failed to get container status \"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\": rpc error: code = NotFound desc = could not find container \"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\": container with ID starting with b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.710193 4837 scope.go:117] "RemoveContainer" containerID="4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.710408 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\": container with ID starting with 4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60 not found: ID does not exist" containerID="4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.710430 4837 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60"} err="failed to get container status \"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\": rpc error: code = NotFound desc = could not find container \"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\": container with ID starting with 4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.710444 4837 scope.go:117] "RemoveContainer" containerID="f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.711832 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282"} err="failed to get container status \"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282\": rpc error: code = NotFound desc = could not find container \"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282\": container with ID starting with f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.711870 4837 scope.go:117] "RemoveContainer" containerID="01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.712146 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c"} err="failed to get container status \"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\": rpc error: code = NotFound desc = could not find container \"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\": container with ID starting with 01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c not found: ID does not 
exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.712175 4837 scope.go:117] "RemoveContainer" containerID="60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.712528 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a"} err="failed to get container status \"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\": rpc error: code = NotFound desc = could not find container \"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\": container with ID starting with 60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.712565 4837 scope.go:117] "RemoveContainer" containerID="80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.712921 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39"} err="failed to get container status \"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\": rpc error: code = NotFound desc = could not find container \"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\": container with ID starting with 80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.712945 4837 scope.go:117] "RemoveContainer" containerID="7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.713824 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7"} err="failed to get container status 
\"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\": rpc error: code = NotFound desc = could not find container \"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\": container with ID starting with 7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.713854 4837 scope.go:117] "RemoveContainer" containerID="954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.714163 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793"} err="failed to get container status \"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\": rpc error: code = NotFound desc = could not find container \"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\": container with ID starting with 954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.714188 4837 scope.go:117] "RemoveContainer" containerID="bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.714487 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2"} err="failed to get container status \"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\": rpc error: code = NotFound desc = could not find container \"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\": container with ID starting with bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.714501 4837 scope.go:117] "RemoveContainer" 
containerID="c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.715368 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d"} err="failed to get container status \"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\": rpc error: code = NotFound desc = could not find container \"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\": container with ID starting with c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.715388 4837 scope.go:117] "RemoveContainer" containerID="b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.715815 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1"} err="failed to get container status \"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\": rpc error: code = NotFound desc = could not find container \"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\": container with ID starting with b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.715841 4837 scope.go:117] "RemoveContainer" containerID="4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.716045 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60"} err="failed to get container status \"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\": rpc error: code = NotFound desc = could 
not find container \"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\": container with ID starting with 4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.716060 4837 scope.go:117] "RemoveContainer" containerID="f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.716267 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282"} err="failed to get container status \"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282\": rpc error: code = NotFound desc = could not find container \"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282\": container with ID starting with f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.716284 4837 scope.go:117] "RemoveContainer" containerID="01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.716485 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c"} err="failed to get container status \"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\": rpc error: code = NotFound desc = could not find container \"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\": container with ID starting with 01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.716502 4837 scope.go:117] "RemoveContainer" containerID="60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 
11:59:51.716730 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a"} err="failed to get container status \"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\": rpc error: code = NotFound desc = could not find container \"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\": container with ID starting with 60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.716750 4837 scope.go:117] "RemoveContainer" containerID="80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.718365 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39"} err="failed to get container status \"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\": rpc error: code = NotFound desc = could not find container \"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\": container with ID starting with 80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.718393 4837 scope.go:117] "RemoveContainer" containerID="7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.718663 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7"} err="failed to get container status \"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\": rpc error: code = NotFound desc = could not find container \"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\": container with ID starting with 
7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.718692 4837 scope.go:117] "RemoveContainer" containerID="954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.719074 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793"} err="failed to get container status \"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\": rpc error: code = NotFound desc = could not find container \"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\": container with ID starting with 954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.719099 4837 scope.go:117] "RemoveContainer" containerID="bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.719382 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2"} err="failed to get container status \"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\": rpc error: code = NotFound desc = could not find container \"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\": container with ID starting with bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.719407 4837 scope.go:117] "RemoveContainer" containerID="c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.720683 4837 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d"} err="failed to get container status \"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\": rpc error: code = NotFound desc = could not find container \"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\": container with ID starting with c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.720718 4837 scope.go:117] "RemoveContainer" containerID="b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.721061 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1"} err="failed to get container status \"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\": rpc error: code = NotFound desc = could not find container \"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\": container with ID starting with b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.721105 4837 scope.go:117] "RemoveContainer" containerID="4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.721417 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60"} err="failed to get container status \"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\": rpc error: code = NotFound desc = could not find container \"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\": container with ID starting with 4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60 not found: ID does not 
exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.721453 4837 scope.go:117] "RemoveContainer" containerID="f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.721729 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282"} err="failed to get container status \"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282\": rpc error: code = NotFound desc = could not find container \"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282\": container with ID starting with f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.721752 4837 scope.go:117] "RemoveContainer" containerID="01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.721999 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c"} err="failed to get container status \"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\": rpc error: code = NotFound desc = could not find container \"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\": container with ID starting with 01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.722029 4837 scope.go:117] "RemoveContainer" containerID="60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.722255 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a"} err="failed to get container status 
\"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\": rpc error: code = NotFound desc = could not find container \"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\": container with ID starting with 60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.722284 4837 scope.go:117] "RemoveContainer" containerID="80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.722577 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39"} err="failed to get container status \"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\": rpc error: code = NotFound desc = could not find container \"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\": container with ID starting with 80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.722652 4837 scope.go:117] "RemoveContainer" containerID="7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.722920 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7"} err="failed to get container status \"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\": rpc error: code = NotFound desc = could not find container \"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\": container with ID starting with 7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.722945 4837 scope.go:117] "RemoveContainer" 
containerID="954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.723145 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793"} err="failed to get container status \"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\": rpc error: code = NotFound desc = could not find container \"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\": container with ID starting with 954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.723191 4837 scope.go:117] "RemoveContainer" containerID="bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.723434 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2"} err="failed to get container status \"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\": rpc error: code = NotFound desc = could not find container \"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\": container with ID starting with bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.723456 4837 scope.go:117] "RemoveContainer" containerID="c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.723609 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d"} err="failed to get container status \"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\": rpc error: code = NotFound desc = could 
not find container \"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\": container with ID starting with c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.723627 4837 scope.go:117] "RemoveContainer" containerID="b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.723840 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1"} err="failed to get container status \"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\": rpc error: code = NotFound desc = could not find container \"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\": container with ID starting with b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.723875 4837 scope.go:117] "RemoveContainer" containerID="4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.724066 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60"} err="failed to get container status \"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\": rpc error: code = NotFound desc = could not find container \"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\": container with ID starting with 4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60 not found: ID does not exist" Mar 13 11:59:52 crc kubenswrapper[4837]: I0313 11:59:52.358322 4837 generic.go:334] "Generic (PLEG): container finished" podID="7b564b0f-ab5a-454b-8588-a645fdec0058" 
containerID="bb5ee5517ad5a43762ab2251e410f5318f82ec2a81d734f67f2f3182b5ffbaac" exitCode=0 Mar 13 11:59:52 crc kubenswrapper[4837]: I0313 11:59:52.358424 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" event={"ID":"7b564b0f-ab5a-454b-8588-a645fdec0058","Type":"ContainerDied","Data":"bb5ee5517ad5a43762ab2251e410f5318f82ec2a81d734f67f2f3182b5ffbaac"} Mar 13 11:59:52 crc kubenswrapper[4837]: I0313 11:59:52.360019 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" event={"ID":"7b564b0f-ab5a-454b-8588-a645fdec0058","Type":"ContainerStarted","Data":"6afb7fa965f127a6881f0f6df8d1b6b9e17a876a7592677e4d80c493ed85fc49"} Mar 13 11:59:52 crc kubenswrapper[4837]: I0313 11:59:52.364468 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qg957_cbb3f4c6-a6c5-4059-8beb-04179d70aff5/kube-multus/2.log" Mar 13 11:59:53 crc kubenswrapper[4837]: I0313 11:59:53.056088 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" path="/var/lib/kubelet/pods/43df29f7-1351-41f5-bfca-17f804837cb4/volumes" Mar 13 11:59:53 crc kubenswrapper[4837]: I0313 11:59:53.373291 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" event={"ID":"7b564b0f-ab5a-454b-8588-a645fdec0058","Type":"ContainerStarted","Data":"92962e881a495a2e0ac4153f3505318b6a007e7e8b0cc140b0f3ba578e6d7723"} Mar 13 11:59:53 crc kubenswrapper[4837]: I0313 11:59:53.373329 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" event={"ID":"7b564b0f-ab5a-454b-8588-a645fdec0058","Type":"ContainerStarted","Data":"866307818a57731c0c4ee24805a1470f96af533343d809502d6e4e2525011118"} Mar 13 11:59:53 crc kubenswrapper[4837]: I0313 11:59:53.373343 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" 
event={"ID":"7b564b0f-ab5a-454b-8588-a645fdec0058","Type":"ContainerStarted","Data":"1ffd37843b0efe2597117fe6f66f589d3258198d3a2d361ff5fc4bbc1d55a53e"} Mar 13 11:59:53 crc kubenswrapper[4837]: I0313 11:59:53.373354 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" event={"ID":"7b564b0f-ab5a-454b-8588-a645fdec0058","Type":"ContainerStarted","Data":"6d035b72086b8923cb3e9c835c1d7ab969f81dac5d95be31c196c38feb879837"} Mar 13 11:59:53 crc kubenswrapper[4837]: I0313 11:59:53.373363 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" event={"ID":"7b564b0f-ab5a-454b-8588-a645fdec0058","Type":"ContainerStarted","Data":"19e6a3f68d9f047b7f0c6cf4e1627a27f8a817a282970d9795805df7cd12052f"} Mar 13 11:59:53 crc kubenswrapper[4837]: I0313 11:59:53.373370 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" event={"ID":"7b564b0f-ab5a-454b-8588-a645fdec0058","Type":"ContainerStarted","Data":"896ceaa574226ecb36e0da9d7fba5e511a2dd2595dbf5da03f24f83259009ea4"} Mar 13 11:59:55 crc kubenswrapper[4837]: I0313 11:59:55.390332 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" event={"ID":"7b564b0f-ab5a-454b-8588-a645fdec0058","Type":"ContainerStarted","Data":"6dde7ba59f89410c93c1507806ee885ec11a606cc8922ccd36d5c63c161a3f8a"} Mar 13 11:59:58 crc kubenswrapper[4837]: I0313 11:59:58.416606 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" event={"ID":"7b564b0f-ab5a-454b-8588-a645fdec0058","Type":"ContainerStarted","Data":"03d8f6aca9a68604183217b6f548ae221326d4cdb9fd9cba920c5ad2cf17b2a4"} Mar 13 11:59:58 crc kubenswrapper[4837]: I0313 11:59:58.417182 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:58 crc kubenswrapper[4837]: I0313 11:59:58.417204 
4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:58 crc kubenswrapper[4837]: I0313 11:59:58.446368 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:58 crc kubenswrapper[4837]: I0313 11:59:58.460927 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" podStartSLOduration=7.460909185 podStartE2EDuration="7.460909185s" podCreationTimestamp="2026-03-13 11:59:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:59:58.457441174 +0000 UTC m=+714.095707937" watchObservedRunningTime="2026-03-13 11:59:58.460909185 +0000 UTC m=+714.099175958" Mar 13 11:59:59 crc kubenswrapper[4837]: I0313 11:59:59.422596 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:59 crc kubenswrapper[4837]: I0313 11:59:59.448350 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.133875 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556720-wqrqr"] Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.134622 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.137703 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.137786 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.137897 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.145424 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556720-wqrqr"] Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.237127 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc"] Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.237978 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.241257 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.243936 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc"] Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.244131 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fflv4\" (UniqueName: \"kubernetes.io/projected/1335d65b-c0fb-4085-86eb-d948f797ef68-kube-api-access-fflv4\") pod \"auto-csr-approver-29556720-wqrqr\" (UID: \"1335d65b-c0fb-4085-86eb-d948f797ef68\") " pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.244469 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.345438 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fflv4\" (UniqueName: \"kubernetes.io/projected/1335d65b-c0fb-4085-86eb-d948f797ef68-kube-api-access-fflv4\") pod \"auto-csr-approver-29556720-wqrqr\" (UID: \"1335d65b-c0fb-4085-86eb-d948f797ef68\") " pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.345486 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6d18151-32fe-4457-814f-33c3ed53dab8-secret-volume\") pod \"collect-profiles-29556720-slvpc\" (UID: \"a6d18151-32fe-4457-814f-33c3ed53dab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 
12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.345511 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfjfp\" (UniqueName: \"kubernetes.io/projected/a6d18151-32fe-4457-814f-33c3ed53dab8-kube-api-access-zfjfp\") pod \"collect-profiles-29556720-slvpc\" (UID: \"a6d18151-32fe-4457-814f-33c3ed53dab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.345556 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6d18151-32fe-4457-814f-33c3ed53dab8-config-volume\") pod \"collect-profiles-29556720-slvpc\" (UID: \"a6d18151-32fe-4457-814f-33c3ed53dab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.365424 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fflv4\" (UniqueName: \"kubernetes.io/projected/1335d65b-c0fb-4085-86eb-d948f797ef68-kube-api-access-fflv4\") pod \"auto-csr-approver-29556720-wqrqr\" (UID: \"1335d65b-c0fb-4085-86eb-d948f797ef68\") " pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.446497 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6d18151-32fe-4457-814f-33c3ed53dab8-config-volume\") pod \"collect-profiles-29556720-slvpc\" (UID: \"a6d18151-32fe-4457-814f-33c3ed53dab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.447344 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6d18151-32fe-4457-814f-33c3ed53dab8-config-volume\") pod 
\"collect-profiles-29556720-slvpc\" (UID: \"a6d18151-32fe-4457-814f-33c3ed53dab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.447793 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6d18151-32fe-4457-814f-33c3ed53dab8-secret-volume\") pod \"collect-profiles-29556720-slvpc\" (UID: \"a6d18151-32fe-4457-814f-33c3ed53dab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.447982 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfjfp\" (UniqueName: \"kubernetes.io/projected/a6d18151-32fe-4457-814f-33c3ed53dab8-kube-api-access-zfjfp\") pod \"collect-profiles-29556720-slvpc\" (UID: \"a6d18151-32fe-4457-814f-33c3ed53dab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.452274 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6d18151-32fe-4457-814f-33c3ed53dab8-secret-volume\") pod \"collect-profiles-29556720-slvpc\" (UID: \"a6d18151-32fe-4457-814f-33c3ed53dab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.458900 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.464209 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfjfp\" (UniqueName: \"kubernetes.io/projected/a6d18151-32fe-4457-814f-33c3ed53dab8-kube-api-access-zfjfp\") pod \"collect-profiles-29556720-slvpc\" (UID: \"a6d18151-32fe-4457-814f-33c3ed53dab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:00 crc kubenswrapper[4837]: E0313 12:00:00.487772 4837 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29556720-wqrqr_openshift-infra_1335d65b-c0fb-4085-86eb-d948f797ef68_0(3b42cbee7461239d7eda46c0d8702483566a37f1f91506d382b292950e70f220): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 12:00:00 crc kubenswrapper[4837]: E0313 12:00:00.487875 4837 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29556720-wqrqr_openshift-infra_1335d65b-c0fb-4085-86eb-d948f797ef68_0(3b42cbee7461239d7eda46c0d8702483566a37f1f91506d382b292950e70f220): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:00 crc kubenswrapper[4837]: E0313 12:00:00.487912 4837 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29556720-wqrqr_openshift-infra_1335d65b-c0fb-4085-86eb-d948f797ef68_0(3b42cbee7461239d7eda46c0d8702483566a37f1f91506d382b292950e70f220): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:00 crc kubenswrapper[4837]: E0313 12:00:00.487988 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29556720-wqrqr_openshift-infra(1335d65b-c0fb-4085-86eb-d948f797ef68)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29556720-wqrqr_openshift-infra(1335d65b-c0fb-4085-86eb-d948f797ef68)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29556720-wqrqr_openshift-infra_1335d65b-c0fb-4085-86eb-d948f797ef68_0(3b42cbee7461239d7eda46c0d8702483566a37f1f91506d382b292950e70f220): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" podUID="1335d65b-c0fb-4085-86eb-d948f797ef68" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.552697 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:00 crc kubenswrapper[4837]: E0313 12:00:00.576759 4837 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager_a6d18151-32fe-4457-814f-33c3ed53dab8_0(0dd387fe97dac6ebba71b653e34d9c2de468b5f529efcc97f6ac0d3f47cdbb41): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 12:00:00 crc kubenswrapper[4837]: E0313 12:00:00.576834 4837 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager_a6d18151-32fe-4457-814f-33c3ed53dab8_0(0dd387fe97dac6ebba71b653e34d9c2de468b5f529efcc97f6ac0d3f47cdbb41): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:00 crc kubenswrapper[4837]: E0313 12:00:00.576863 4837 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager_a6d18151-32fe-4457-814f-33c3ed53dab8_0(0dd387fe97dac6ebba71b653e34d9c2de468b5f529efcc97f6ac0d3f47cdbb41): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:00 crc kubenswrapper[4837]: E0313 12:00:00.576927 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager(a6d18151-32fe-4457-814f-33c3ed53dab8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager(a6d18151-32fe-4457-814f-33c3ed53dab8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager_a6d18151-32fe-4457-814f-33c3ed53dab8_0(0dd387fe97dac6ebba71b653e34d9c2de468b5f529efcc97f6ac0d3f47cdbb41): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" podUID="a6d18151-32fe-4457-814f-33c3ed53dab8" Mar 13 12:00:01 crc kubenswrapper[4837]: I0313 12:00:01.432538 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:01 crc kubenswrapper[4837]: I0313 12:00:01.432597 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:01 crc kubenswrapper[4837]: I0313 12:00:01.433136 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:01 crc kubenswrapper[4837]: I0313 12:00:01.433256 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:01 crc kubenswrapper[4837]: E0313 12:00:01.461035 4837 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29556720-wqrqr_openshift-infra_1335d65b-c0fb-4085-86eb-d948f797ef68_0(4eb7e214cd1b7a56654cd4d7e82c2556d3e79d5da56a054f1b205ee2a1f2bc6a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 12:00:01 crc kubenswrapper[4837]: E0313 12:00:01.461097 4837 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29556720-wqrqr_openshift-infra_1335d65b-c0fb-4085-86eb-d948f797ef68_0(4eb7e214cd1b7a56654cd4d7e82c2556d3e79d5da56a054f1b205ee2a1f2bc6a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:01 crc kubenswrapper[4837]: E0313 12:00:01.461117 4837 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29556720-wqrqr_openshift-infra_1335d65b-c0fb-4085-86eb-d948f797ef68_0(4eb7e214cd1b7a56654cd4d7e82c2556d3e79d5da56a054f1b205ee2a1f2bc6a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:01 crc kubenswrapper[4837]: E0313 12:00:01.461163 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29556720-wqrqr_openshift-infra(1335d65b-c0fb-4085-86eb-d948f797ef68)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29556720-wqrqr_openshift-infra(1335d65b-c0fb-4085-86eb-d948f797ef68)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29556720-wqrqr_openshift-infra_1335d65b-c0fb-4085-86eb-d948f797ef68_0(4eb7e214cd1b7a56654cd4d7e82c2556d3e79d5da56a054f1b205ee2a1f2bc6a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" podUID="1335d65b-c0fb-4085-86eb-d948f797ef68" Mar 13 12:00:01 crc kubenswrapper[4837]: E0313 12:00:01.466763 4837 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager_a6d18151-32fe-4457-814f-33c3ed53dab8_0(57f223134f24ddc5890336e84be957881b1f2e52ef9d200b0157237c48d1f945): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 12:00:01 crc kubenswrapper[4837]: E0313 12:00:01.466811 4837 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager_a6d18151-32fe-4457-814f-33c3ed53dab8_0(57f223134f24ddc5890336e84be957881b1f2e52ef9d200b0157237c48d1f945): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:01 crc kubenswrapper[4837]: E0313 12:00:01.466834 4837 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager_a6d18151-32fe-4457-814f-33c3ed53dab8_0(57f223134f24ddc5890336e84be957881b1f2e52ef9d200b0157237c48d1f945): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:01 crc kubenswrapper[4837]: E0313 12:00:01.466888 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager(a6d18151-32fe-4457-814f-33c3ed53dab8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager(a6d18151-32fe-4457-814f-33c3ed53dab8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager_a6d18151-32fe-4457-814f-33c3ed53dab8_0(57f223134f24ddc5890336e84be957881b1f2e52ef9d200b0157237c48d1f945): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" podUID="a6d18151-32fe-4457-814f-33c3ed53dab8" Mar 13 12:00:04 crc kubenswrapper[4837]: I0313 12:00:04.047821 4837 scope.go:117] "RemoveContainer" containerID="1effae1c86d3c4f5369295262f269b1dad692c561321e1c868d2b4fe7f736d7c" Mar 13 12:00:04 crc kubenswrapper[4837]: E0313 12:00:04.048313 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-qg957_openshift-multus(cbb3f4c6-a6c5-4059-8beb-04179d70aff5)\"" pod="openshift-multus/multus-qg957" podUID="cbb3f4c6-a6c5-4059-8beb-04179d70aff5" Mar 13 12:00:14 crc kubenswrapper[4837]: I0313 12:00:14.047779 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:14 crc kubenswrapper[4837]: I0313 12:00:14.049244 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:14 crc kubenswrapper[4837]: E0313 12:00:14.097827 4837 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager_a6d18151-32fe-4457-814f-33c3ed53dab8_0(3ccb157306fc69ff6691b77cd3a42f47ded828fe96e3883eab7134165784595a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 12:00:14 crc kubenswrapper[4837]: E0313 12:00:14.098164 4837 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager_a6d18151-32fe-4457-814f-33c3ed53dab8_0(3ccb157306fc69ff6691b77cd3a42f47ded828fe96e3883eab7134165784595a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:14 crc kubenswrapper[4837]: E0313 12:00:14.098183 4837 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager_a6d18151-32fe-4457-814f-33c3ed53dab8_0(3ccb157306fc69ff6691b77cd3a42f47ded828fe96e3883eab7134165784595a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:14 crc kubenswrapper[4837]: E0313 12:00:14.098226 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager(a6d18151-32fe-4457-814f-33c3ed53dab8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager(a6d18151-32fe-4457-814f-33c3ed53dab8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager_a6d18151-32fe-4457-814f-33c3ed53dab8_0(3ccb157306fc69ff6691b77cd3a42f47ded828fe96e3883eab7134165784595a): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" podUID="a6d18151-32fe-4457-814f-33c3ed53dab8"
Mar 13 12:00:16 crc kubenswrapper[4837]: I0313 12:00:16.047686 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556720-wqrqr"
Mar 13 12:00:16 crc kubenswrapper[4837]: I0313 12:00:16.049120 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556720-wqrqr"
Mar 13 12:00:16 crc kubenswrapper[4837]: E0313 12:00:16.083969 4837 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29556720-wqrqr_openshift-infra_1335d65b-c0fb-4085-86eb-d948f797ef68_0(b313d1c76170bd48b4bbcc367ae0fed8f9e48566cf1dfcb70b7a29f437fb08e5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 13 12:00:16 crc kubenswrapper[4837]: E0313 12:00:16.084103 4837 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29556720-wqrqr_openshift-infra_1335d65b-c0fb-4085-86eb-d948f797ef68_0(b313d1c76170bd48b4bbcc367ae0fed8f9e48566cf1dfcb70b7a29f437fb08e5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29556720-wqrqr"
Mar 13 12:00:16 crc kubenswrapper[4837]: E0313 12:00:16.084141 4837 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29556720-wqrqr_openshift-infra_1335d65b-c0fb-4085-86eb-d948f797ef68_0(b313d1c76170bd48b4bbcc367ae0fed8f9e48566cf1dfcb70b7a29f437fb08e5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29556720-wqrqr"
Mar 13 12:00:16 crc kubenswrapper[4837]: E0313 12:00:16.084211 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29556720-wqrqr_openshift-infra(1335d65b-c0fb-4085-86eb-d948f797ef68)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29556720-wqrqr_openshift-infra(1335d65b-c0fb-4085-86eb-d948f797ef68)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29556720-wqrqr_openshift-infra_1335d65b-c0fb-4085-86eb-d948f797ef68_0(b313d1c76170bd48b4bbcc367ae0fed8f9e48566cf1dfcb70b7a29f437fb08e5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" podUID="1335d65b-c0fb-4085-86eb-d948f797ef68"
Mar 13 12:00:17 crc kubenswrapper[4837]: I0313 12:00:17.048053 4837 scope.go:117] "RemoveContainer" containerID="1effae1c86d3c4f5369295262f269b1dad692c561321e1c868d2b4fe7f736d7c"
Mar 13 12:00:17 crc kubenswrapper[4837]: I0313 12:00:17.518226 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qg957_cbb3f4c6-a6c5-4059-8beb-04179d70aff5/kube-multus/2.log"
Mar 13 12:00:17 crc kubenswrapper[4837]: I0313 12:00:17.519878 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qg957" event={"ID":"cbb3f4c6-a6c5-4059-8beb-04179d70aff5","Type":"ContainerStarted","Data":"af2a6c239ad0d8b155fd9808f142bbb42034d2d57141d3abc86f61d28daa588e"}
Mar 13 12:00:21 crc kubenswrapper[4837]: I0313 12:00:21.529208 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz"
Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.556591 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h"]
Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.558107 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h"
Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.560923 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.571747 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h"]
Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.575539 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h\" (UID: \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h"
Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.575573 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh7sc\" (UniqueName: \"kubernetes.io/projected/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-kube-api-access-hh7sc\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h\" (UID: \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h"
Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.575610 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h\" (UID: \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h"
Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.676616 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h\" (UID: \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h"
Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.676684 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh7sc\" (UniqueName: \"kubernetes.io/projected/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-kube-api-access-hh7sc\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h\" (UID: \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h"
Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.676734 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h\" (UID: \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h"
Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.677283 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h\" (UID: \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h"
Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.677329 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h\" (UID: \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h"
Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.696241 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh7sc\" (UniqueName: \"kubernetes.io/projected/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-kube-api-access-hh7sc\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h\" (UID: \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h"
Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.874901 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h"
Mar 13 12:00:27 crc kubenswrapper[4837]: I0313 12:00:27.047991 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc"
Mar 13 12:00:27 crc kubenswrapper[4837]: I0313 12:00:27.048746 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc"
Mar 13 12:00:27 crc kubenswrapper[4837]: I0313 12:00:27.131247 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h"]
Mar 13 12:00:27 crc kubenswrapper[4837]: I0313 12:00:27.268012 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc"]
Mar 13 12:00:27 crc kubenswrapper[4837]: W0313 12:00:27.271345 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6d18151_32fe_4457_814f_33c3ed53dab8.slice/crio-e11c2e11328d466588f9e5269543514fe5ebff4bbda4de6454f2bd5db61ff79e WatchSource:0}: Error finding container e11c2e11328d466588f9e5269543514fe5ebff4bbda4de6454f2bd5db61ff79e: Status 404 returned error can't find the container with id e11c2e11328d466588f9e5269543514fe5ebff4bbda4de6454f2bd5db61ff79e
Mar 13 12:00:27 crc kubenswrapper[4837]: I0313 12:00:27.574951 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h" event={"ID":"c49e70e5-a4f6-4782-aa38-2faeb20ec38a","Type":"ContainerStarted","Data":"fd40f2ee3c46e2914f7c79e21fff2402975f3e74643fb0fdb37eea494930a16a"}
Mar 13 12:00:27 crc kubenswrapper[4837]: I0313 12:00:27.575247 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h" event={"ID":"c49e70e5-a4f6-4782-aa38-2faeb20ec38a","Type":"ContainerStarted","Data":"197d1987346ba3a53d3f2e66c5ace726d54ba6e0c9cc65dbb51ca5434993db91"}
Mar 13 12:00:27 crc kubenswrapper[4837]: I0313 12:00:27.577629 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" event={"ID":"a6d18151-32fe-4457-814f-33c3ed53dab8","Type":"ContainerStarted","Data":"2d2bfd751903359f1fbdf915afe9614d288e33b823b0215d4cd3578202f69f1c"}
Mar 13 12:00:27 crc kubenswrapper[4837]: I0313 12:00:27.577696 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" event={"ID":"a6d18151-32fe-4457-814f-33c3ed53dab8","Type":"ContainerStarted","Data":"e11c2e11328d466588f9e5269543514fe5ebff4bbda4de6454f2bd5db61ff79e"}
Mar 13 12:00:27 crc kubenswrapper[4837]: I0313 12:00:27.611743 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" podStartSLOduration=27.611701705 podStartE2EDuration="27.611701705s" podCreationTimestamp="2026-03-13 12:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:00:27.607397188 +0000 UTC m=+743.245663971" watchObservedRunningTime="2026-03-13 12:00:27.611701705 +0000 UTC m=+743.249968488"
Mar 13 12:00:28 crc kubenswrapper[4837]: I0313 12:00:28.047718 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556720-wqrqr"
Mar 13 12:00:28 crc kubenswrapper[4837]: I0313 12:00:28.048374 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556720-wqrqr"
Mar 13 12:00:28 crc kubenswrapper[4837]: I0313 12:00:28.228842 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556720-wqrqr"]
Mar 13 12:00:28 crc kubenswrapper[4837]: W0313 12:00:28.235937 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1335d65b_c0fb_4085_86eb_d948f797ef68.slice/crio-dc09c091aad8a901e4525c66559bbe36864c7168b149b94e87403d64c4f1e9a6 WatchSource:0}: Error finding container dc09c091aad8a901e4525c66559bbe36864c7168b149b94e87403d64c4f1e9a6: Status 404 returned error can't find the container with id dc09c091aad8a901e4525c66559bbe36864c7168b149b94e87403d64c4f1e9a6
Mar 13 12:00:28 crc kubenswrapper[4837]: I0313 12:00:28.584793 4837 generic.go:334] "Generic (PLEG): container finished" podID="c49e70e5-a4f6-4782-aa38-2faeb20ec38a" containerID="fd40f2ee3c46e2914f7c79e21fff2402975f3e74643fb0fdb37eea494930a16a" exitCode=0
Mar 13 12:00:28 crc kubenswrapper[4837]: I0313 12:00:28.584873 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h" event={"ID":"c49e70e5-a4f6-4782-aa38-2faeb20ec38a","Type":"ContainerDied","Data":"fd40f2ee3c46e2914f7c79e21fff2402975f3e74643fb0fdb37eea494930a16a"}
Mar 13 12:00:28 crc kubenswrapper[4837]: I0313 12:00:28.586330 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" event={"ID":"1335d65b-c0fb-4085-86eb-d948f797ef68","Type":"ContainerStarted","Data":"dc09c091aad8a901e4525c66559bbe36864c7168b149b94e87403d64c4f1e9a6"}
Mar 13 12:00:28 crc kubenswrapper[4837]: I0313 12:00:28.589738 4837 generic.go:334] "Generic (PLEG): container finished" podID="a6d18151-32fe-4457-814f-33c3ed53dab8" containerID="2d2bfd751903359f1fbdf915afe9614d288e33b823b0215d4cd3578202f69f1c" exitCode=0
Mar 13 12:00:28 crc kubenswrapper[4837]: I0313 12:00:28.589789 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" event={"ID":"a6d18151-32fe-4457-814f-33c3ed53dab8","Type":"ContainerDied","Data":"2d2bfd751903359f1fbdf915afe9614d288e33b823b0215d4cd3578202f69f1c"}
Mar 13 12:00:29 crc kubenswrapper[4837]: I0313 12:00:29.596918 4837 generic.go:334] "Generic (PLEG): container finished" podID="1335d65b-c0fb-4085-86eb-d948f797ef68" containerID="1a04d5901dd1375cafd0fc584ce462f13000b8c9b02a1c2603aedb866420cd51" exitCode=0
Mar 13 12:00:29 crc kubenswrapper[4837]: I0313 12:00:29.597822 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" event={"ID":"1335d65b-c0fb-4085-86eb-d948f797ef68","Type":"ContainerDied","Data":"1a04d5901dd1375cafd0fc584ce462f13000b8c9b02a1c2603aedb866420cd51"}
Mar 13 12:00:29 crc kubenswrapper[4837]: I0313 12:00:29.810427 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc"
Mar 13 12:00:29 crc kubenswrapper[4837]: I0313 12:00:29.930078 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfjfp\" (UniqueName: \"kubernetes.io/projected/a6d18151-32fe-4457-814f-33c3ed53dab8-kube-api-access-zfjfp\") pod \"a6d18151-32fe-4457-814f-33c3ed53dab8\" (UID: \"a6d18151-32fe-4457-814f-33c3ed53dab8\") "
Mar 13 12:00:29 crc kubenswrapper[4837]: I0313 12:00:29.930200 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6d18151-32fe-4457-814f-33c3ed53dab8-config-volume\") pod \"a6d18151-32fe-4457-814f-33c3ed53dab8\" (UID: \"a6d18151-32fe-4457-814f-33c3ed53dab8\") "
Mar 13 12:00:29 crc kubenswrapper[4837]: I0313 12:00:29.930265 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6d18151-32fe-4457-814f-33c3ed53dab8-secret-volume\") pod \"a6d18151-32fe-4457-814f-33c3ed53dab8\" (UID: \"a6d18151-32fe-4457-814f-33c3ed53dab8\") "
Mar 13 12:00:29 crc kubenswrapper[4837]: I0313 12:00:29.931158 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6d18151-32fe-4457-814f-33c3ed53dab8-config-volume" (OuterVolumeSpecName: "config-volume") pod "a6d18151-32fe-4457-814f-33c3ed53dab8" (UID: "a6d18151-32fe-4457-814f-33c3ed53dab8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:00:29 crc kubenswrapper[4837]: I0313 12:00:29.935970 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6d18151-32fe-4457-814f-33c3ed53dab8-kube-api-access-zfjfp" (OuterVolumeSpecName: "kube-api-access-zfjfp") pod "a6d18151-32fe-4457-814f-33c3ed53dab8" (UID: "a6d18151-32fe-4457-814f-33c3ed53dab8"). InnerVolumeSpecName "kube-api-access-zfjfp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:00:29 crc kubenswrapper[4837]: I0313 12:00:29.936078 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d18151-32fe-4457-814f-33c3ed53dab8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a6d18151-32fe-4457-814f-33c3ed53dab8" (UID: "a6d18151-32fe-4457-814f-33c3ed53dab8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:00:30 crc kubenswrapper[4837]: I0313 12:00:30.032258 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfjfp\" (UniqueName: \"kubernetes.io/projected/a6d18151-32fe-4457-814f-33c3ed53dab8-kube-api-access-zfjfp\") on node \"crc\" DevicePath \"\""
Mar 13 12:00:30 crc kubenswrapper[4837]: I0313 12:00:30.032296 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6d18151-32fe-4457-814f-33c3ed53dab8-config-volume\") on node \"crc\" DevicePath \"\""
Mar 13 12:00:30 crc kubenswrapper[4837]: I0313 12:00:30.032305 4837 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6d18151-32fe-4457-814f-33c3ed53dab8-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 13 12:00:30 crc kubenswrapper[4837]: I0313 12:00:30.608506 4837 generic.go:334] "Generic (PLEG): container finished" podID="c49e70e5-a4f6-4782-aa38-2faeb20ec38a" containerID="98d1e6ef469c7b5964bf18569c9c28706c631a5559be2f9d43b97d26249a2d7c" exitCode=0
Mar 13 12:00:30 crc kubenswrapper[4837]: I0313 12:00:30.608573 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h" event={"ID":"c49e70e5-a4f6-4782-aa38-2faeb20ec38a","Type":"ContainerDied","Data":"98d1e6ef469c7b5964bf18569c9c28706c631a5559be2f9d43b97d26249a2d7c"}
Mar 13 12:00:30 crc kubenswrapper[4837]: I0313 12:00:30.612589 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc"
Mar 13 12:00:30 crc kubenswrapper[4837]: I0313 12:00:30.613853 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" event={"ID":"a6d18151-32fe-4457-814f-33c3ed53dab8","Type":"ContainerDied","Data":"e11c2e11328d466588f9e5269543514fe5ebff4bbda4de6454f2bd5db61ff79e"}
Mar 13 12:00:30 crc kubenswrapper[4837]: I0313 12:00:30.613913 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e11c2e11328d466588f9e5269543514fe5ebff4bbda4de6454f2bd5db61ff79e"
Mar 13 12:00:30 crc kubenswrapper[4837]: I0313 12:00:30.835124 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556720-wqrqr"
Mar 13 12:00:30 crc kubenswrapper[4837]: I0313 12:00:30.944574 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fflv4\" (UniqueName: \"kubernetes.io/projected/1335d65b-c0fb-4085-86eb-d948f797ef68-kube-api-access-fflv4\") pod \"1335d65b-c0fb-4085-86eb-d948f797ef68\" (UID: \"1335d65b-c0fb-4085-86eb-d948f797ef68\") "
Mar 13 12:00:30 crc kubenswrapper[4837]: I0313 12:00:30.949477 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1335d65b-c0fb-4085-86eb-d948f797ef68-kube-api-access-fflv4" (OuterVolumeSpecName: "kube-api-access-fflv4") pod "1335d65b-c0fb-4085-86eb-d948f797ef68" (UID: "1335d65b-c0fb-4085-86eb-d948f797ef68"). InnerVolumeSpecName "kube-api-access-fflv4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:00:31 crc kubenswrapper[4837]: I0313 12:00:31.045763 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fflv4\" (UniqueName: \"kubernetes.io/projected/1335d65b-c0fb-4085-86eb-d948f797ef68-kube-api-access-fflv4\") on node \"crc\" DevicePath \"\""
Mar 13 12:00:31 crc kubenswrapper[4837]: I0313 12:00:31.622341 4837 generic.go:334] "Generic (PLEG): container finished" podID="c49e70e5-a4f6-4782-aa38-2faeb20ec38a" containerID="518a2132bb0b0d605b34515994dcb95f1f8ab534bc9ae285b7a96e1e9d3840e5" exitCode=0
Mar 13 12:00:31 crc kubenswrapper[4837]: I0313 12:00:31.622410 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h" event={"ID":"c49e70e5-a4f6-4782-aa38-2faeb20ec38a","Type":"ContainerDied","Data":"518a2132bb0b0d605b34515994dcb95f1f8ab534bc9ae285b7a96e1e9d3840e5"}
Mar 13 12:00:31 crc kubenswrapper[4837]: I0313 12:00:31.624992 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" event={"ID":"1335d65b-c0fb-4085-86eb-d948f797ef68","Type":"ContainerDied","Data":"dc09c091aad8a901e4525c66559bbe36864c7168b149b94e87403d64c4f1e9a6"}
Mar 13 12:00:31 crc kubenswrapper[4837]: I0313 12:00:31.625026 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc09c091aad8a901e4525c66559bbe36864c7168b149b94e87403d64c4f1e9a6"
Mar 13 12:00:31 crc kubenswrapper[4837]: I0313 12:00:31.625074 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556720-wqrqr"
Mar 13 12:00:31 crc kubenswrapper[4837]: I0313 12:00:31.909567 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556714-jzzgx"]
Mar 13 12:00:31 crc kubenswrapper[4837]: I0313 12:00:31.913602 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556714-jzzgx"]
Mar 13 12:00:32 crc kubenswrapper[4837]: I0313 12:00:32.867505 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h"
Mar 13 12:00:32 crc kubenswrapper[4837]: I0313 12:00:32.968542 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-util\") pod \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\" (UID: \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\") "
Mar 13 12:00:32 crc kubenswrapper[4837]: I0313 12:00:32.968698 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh7sc\" (UniqueName: \"kubernetes.io/projected/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-kube-api-access-hh7sc\") pod \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\" (UID: \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\") "
Mar 13 12:00:32 crc kubenswrapper[4837]: I0313 12:00:32.968828 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-bundle\") pod \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\" (UID: \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\") "
Mar 13 12:00:32 crc kubenswrapper[4837]: I0313 12:00:32.969925 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-bundle" (OuterVolumeSpecName: "bundle") pod "c49e70e5-a4f6-4782-aa38-2faeb20ec38a" (UID: "c49e70e5-a4f6-4782-aa38-2faeb20ec38a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:00:32 crc kubenswrapper[4837]: I0313 12:00:32.976523 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-kube-api-access-hh7sc" (OuterVolumeSpecName: "kube-api-access-hh7sc") pod "c49e70e5-a4f6-4782-aa38-2faeb20ec38a" (UID: "c49e70e5-a4f6-4782-aa38-2faeb20ec38a"). InnerVolumeSpecName "kube-api-access-hh7sc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:00:32 crc kubenswrapper[4837]: I0313 12:00:32.993368 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-util" (OuterVolumeSpecName: "util") pod "c49e70e5-a4f6-4782-aa38-2faeb20ec38a" (UID: "c49e70e5-a4f6-4782-aa38-2faeb20ec38a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:00:33 crc kubenswrapper[4837]: I0313 12:00:33.058845 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b7a269a-3d94-4758-922d-9886312f2a25" path="/var/lib/kubelet/pods/2b7a269a-3d94-4758-922d-9886312f2a25/volumes"
Mar 13 12:00:33 crc kubenswrapper[4837]: I0313 12:00:33.070522 4837 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:00:33 crc kubenswrapper[4837]: I0313 12:00:33.070556 4837 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-util\") on node \"crc\" DevicePath \"\""
Mar 13 12:00:33 crc kubenswrapper[4837]: I0313 12:00:33.070566 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh7sc\" (UniqueName: \"kubernetes.io/projected/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-kube-api-access-hh7sc\") on node \"crc\" DevicePath \"\""
Mar 13 12:00:33 crc kubenswrapper[4837]: I0313 12:00:33.640900 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h" event={"ID":"c49e70e5-a4f6-4782-aa38-2faeb20ec38a","Type":"ContainerDied","Data":"197d1987346ba3a53d3f2e66c5ace726d54ba6e0c9cc65dbb51ca5434993db91"}
Mar 13 12:00:33 crc kubenswrapper[4837]: I0313 12:00:33.640966 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="197d1987346ba3a53d3f2e66c5ace726d54ba6e0c9cc65dbb51ca5434993db91"
Mar 13 12:00:33 crc kubenswrapper[4837]: I0313 12:00:33.641064 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h"
Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.112877 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-zf78q"]
Mar 13 12:00:38 crc kubenswrapper[4837]: E0313 12:00:38.113586 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c49e70e5-a4f6-4782-aa38-2faeb20ec38a" containerName="util"
Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.113600 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c49e70e5-a4f6-4782-aa38-2faeb20ec38a" containerName="util"
Mar 13 12:00:38 crc kubenswrapper[4837]: E0313 12:00:38.113613 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c49e70e5-a4f6-4782-aa38-2faeb20ec38a" containerName="extract"
Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.113618 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c49e70e5-a4f6-4782-aa38-2faeb20ec38a" containerName="extract"
Mar 13 12:00:38 crc kubenswrapper[4837]: E0313 12:00:38.113631 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c49e70e5-a4f6-4782-aa38-2faeb20ec38a" containerName="pull"
Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.113654 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c49e70e5-a4f6-4782-aa38-2faeb20ec38a" containerName="pull"
Mar 13 12:00:38 crc kubenswrapper[4837]: E0313 12:00:38.113669 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1335d65b-c0fb-4085-86eb-d948f797ef68" containerName="oc"
Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.113674 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1335d65b-c0fb-4085-86eb-d948f797ef68" containerName="oc"
Mar 13 12:00:38 crc kubenswrapper[4837]: E0313 12:00:38.113686 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6d18151-32fe-4457-814f-33c3ed53dab8" containerName="collect-profiles"
Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.113691 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d18151-32fe-4457-814f-33c3ed53dab8" containerName="collect-profiles"
Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.113776 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6d18151-32fe-4457-814f-33c3ed53dab8" containerName="collect-profiles"
Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.113788 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c49e70e5-a4f6-4782-aa38-2faeb20ec38a" containerName="extract"
Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.113799 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="1335d65b-c0fb-4085-86eb-d948f797ef68" containerName="oc"
Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.114140 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zf78q"
Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.115997 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.116486 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-cfzms"
Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.116579 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.122424 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-zf78q"]
Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.237301 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rqd6\" (UniqueName: \"kubernetes.io/projected/ef7096b9-861a-4889-9318-535c35151777-kube-api-access-9rqd6\") pod \"nmstate-operator-796d4cfff4-zf78q\" (UID: \"ef7096b9-861a-4889-9318-535c35151777\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-zf78q"
Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.338835 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rqd6\" (UniqueName: \"kubernetes.io/projected/ef7096b9-861a-4889-9318-535c35151777-kube-api-access-9rqd6\") pod \"nmstate-operator-796d4cfff4-zf78q\" (UID: \"ef7096b9-861a-4889-9318-535c35151777\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-zf78q"
Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.364305 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rqd6\" (UniqueName: \"kubernetes.io/projected/ef7096b9-861a-4889-9318-535c35151777-kube-api-access-9rqd6\") pod \"nmstate-operator-796d4cfff4-zf78q\" (UID: \"ef7096b9-861a-4889-9318-535c35151777\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-zf78q"
Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.429964 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zf78q"
Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.629573 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-zf78q"]
Mar 13 12:00:38 crc kubenswrapper[4837]: W0313 12:00:38.638411 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef7096b9_861a_4889_9318_535c35151777.slice/crio-7f6edabb0070547e15deb962b244bbed98a0578ef1a7447be43c187885888f8f WatchSource:0}: Error finding container 7f6edabb0070547e15deb962b244bbed98a0578ef1a7447be43c187885888f8f: Status 404 returned error can't find the container with id 7f6edabb0070547e15deb962b244bbed98a0578ef1a7447be43c187885888f8f
Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.672703 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zf78q" event={"ID":"ef7096b9-861a-4889-9318-535c35151777","Type":"ContainerStarted","Data":"7f6edabb0070547e15deb962b244bbed98a0578ef1a7447be43c187885888f8f"}
Mar 13 12:00:41 crc kubenswrapper[4837]: I0313 12:00:41.690540 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zf78q" event={"ID":"ef7096b9-861a-4889-9318-535c35151777","Type":"ContainerStarted","Data":"b547724cd9bc88b4ef0a860c645ac542cb68b7437143e52f8ae4ff67ee817dc2"}
Mar 13 12:00:41 crc kubenswrapper[4837]: I0313 12:00:41.707369 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zf78q" podStartSLOduration=1.197140837 podStartE2EDuration="3.707348569s" podCreationTimestamp="2026-03-13 12:00:38 +0000 UTC" firstStartedPulling="2026-03-13 12:00:38.640914777 +0000 UTC m=+754.279181550" lastFinishedPulling="2026-03-13 12:00:41.151122519 +0000 UTC m=+756.789389282" observedRunningTime="2026-03-13 12:00:41.705400206 +0000 UTC m=+757.343666989" watchObservedRunningTime="2026-03-13 12:00:41.707348569 +0000 UTC m=+757.345615332"
Mar 13 12:00:46 crc kubenswrapper[4837]: I0313 12:00:46.876371 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-8xzdk"]
Mar 13 12:00:46 crc kubenswrapper[4837]: I0313 12:00:46.877984 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8xzdk"
Mar 13 12:00:46 crc kubenswrapper[4837]: I0313 12:00:46.880231 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-m78qn"
Mar 13 12:00:46 crc kubenswrapper[4837]: I0313 12:00:46.888520 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-8xzdk"]
Mar 13 12:00:46 crc kubenswrapper[4837]: I0313 12:00:46.893184 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h"]
Mar 13 12:00:46 crc kubenswrapper[4837]: I0313 12:00:46.894037 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h"
Mar 13 12:00:46 crc kubenswrapper[4837]: I0313 12:00:46.896471 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Mar 13 12:00:46 crc kubenswrapper[4837]: I0313 12:00:46.911250 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-vqqqz"]
Mar 13 12:00:46 crc kubenswrapper[4837]: I0313 12:00:46.912086 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-vqqqz"
Mar 13 12:00:46 crc kubenswrapper[4837]: I0313 12:00:46.925794 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h"]
Mar 13 12:00:46 crc kubenswrapper[4837]: I0313 12:00:46.944147 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8h2h\" (UniqueName: \"kubernetes.io/projected/5d1f2d02-86ab-4679-a4e4-530ad37e4302-kube-api-access-m8h2h\") pod \"nmstate-metrics-9b8c8685d-8xzdk\" (UID: \"5d1f2d02-86ab-4679-a4e4-530ad37e4302\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8xzdk"
Mar 13 12:00:46 crc kubenswrapper[4837]: I0313 12:00:46.944245 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0b06c77a-f41d-41a6-b115-f12cc5109c0c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-6cx5h\" (UID: \"0b06c77a-f41d-41a6-b115-f12cc5109c0c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h"
Mar 13 12:00:46 crc kubenswrapper[4837]: I0313 12:00:46.944283 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4985t\" (UniqueName: \"kubernetes.io/projected/0b06c77a-f41d-41a6-b115-f12cc5109c0c-kube-api-access-4985t\") pod \"nmstate-webhook-5f558f5558-6cx5h\" (UID: \"0b06c77a-f41d-41a6-b115-f12cc5109c0c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h"
Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.010182 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr"]
Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.010841 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr"
Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.012684 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.013827 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.014236 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-xpb8c"
Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.019597 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr"]
Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.045513 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ebe31727-805d-472e-89d3-e99b11435be1-ovs-socket\") pod \"nmstate-handler-vqqqz\" (UID: \"ebe31727-805d-472e-89d3-e99b11435be1\") " pod="openshift-nmstate/nmstate-handler-vqqqz"
Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.045555 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm2mf\" (UniqueName: \"kubernetes.io/projected/00b31b3f-b520-493a-ad26-679e09376e81-kube-api-access-cm2mf\") pod \"nmstate-console-plugin-86f58fcf4-fpxmr\" (UID: \"00b31b3f-b520-493a-ad26-679e09376e81\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr"
Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.045587 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ebe31727-805d-472e-89d3-e99b11435be1-nmstate-lock\") pod \"nmstate-handler-vqqqz\" (UID: \"ebe31727-805d-472e-89d3-e99b11435be1\") "
pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.045618 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8h2h\" (UniqueName: \"kubernetes.io/projected/5d1f2d02-86ab-4679-a4e4-530ad37e4302-kube-api-access-m8h2h\") pod \"nmstate-metrics-9b8c8685d-8xzdk\" (UID: \"5d1f2d02-86ab-4679-a4e4-530ad37e4302\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8xzdk" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.045650 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ebe31727-805d-472e-89d3-e99b11435be1-dbus-socket\") pod \"nmstate-handler-vqqqz\" (UID: \"ebe31727-805d-472e-89d3-e99b11435be1\") " pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.045791 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/00b31b3f-b520-493a-ad26-679e09376e81-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-fpxmr\" (UID: \"00b31b3f-b520-493a-ad26-679e09376e81\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.045873 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/00b31b3f-b520-493a-ad26-679e09376e81-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-fpxmr\" (UID: \"00b31b3f-b520-493a-ad26-679e09376e81\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.045922 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78lxd\" (UniqueName: 
\"kubernetes.io/projected/ebe31727-805d-472e-89d3-e99b11435be1-kube-api-access-78lxd\") pod \"nmstate-handler-vqqqz\" (UID: \"ebe31727-805d-472e-89d3-e99b11435be1\") " pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.045953 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0b06c77a-f41d-41a6-b115-f12cc5109c0c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-6cx5h\" (UID: \"0b06c77a-f41d-41a6-b115-f12cc5109c0c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.046005 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4985t\" (UniqueName: \"kubernetes.io/projected/0b06c77a-f41d-41a6-b115-f12cc5109c0c-kube-api-access-4985t\") pod \"nmstate-webhook-5f558f5558-6cx5h\" (UID: \"0b06c77a-f41d-41a6-b115-f12cc5109c0c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h" Mar 13 12:00:47 crc kubenswrapper[4837]: E0313 12:00:47.046152 4837 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 13 12:00:47 crc kubenswrapper[4837]: E0313 12:00:47.046227 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b06c77a-f41d-41a6-b115-f12cc5109c0c-tls-key-pair podName:0b06c77a-f41d-41a6-b115-f12cc5109c0c nodeName:}" failed. No retries permitted until 2026-03-13 12:00:47.546208177 +0000 UTC m=+763.184474940 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/0b06c77a-f41d-41a6-b115-f12cc5109c0c-tls-key-pair") pod "nmstate-webhook-5f558f5558-6cx5h" (UID: "0b06c77a-f41d-41a6-b115-f12cc5109c0c") : secret "openshift-nmstate-webhook" not found Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.071130 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8h2h\" (UniqueName: \"kubernetes.io/projected/5d1f2d02-86ab-4679-a4e4-530ad37e4302-kube-api-access-m8h2h\") pod \"nmstate-metrics-9b8c8685d-8xzdk\" (UID: \"5d1f2d02-86ab-4679-a4e4-530ad37e4302\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8xzdk" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.078135 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4985t\" (UniqueName: \"kubernetes.io/projected/0b06c77a-f41d-41a6-b115-f12cc5109c0c-kube-api-access-4985t\") pod \"nmstate-webhook-5f558f5558-6cx5h\" (UID: \"0b06c77a-f41d-41a6-b115-f12cc5109c0c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.147242 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ebe31727-805d-472e-89d3-e99b11435be1-dbus-socket\") pod \"nmstate-handler-vqqqz\" (UID: \"ebe31727-805d-472e-89d3-e99b11435be1\") " pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.147292 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/00b31b3f-b520-493a-ad26-679e09376e81-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-fpxmr\" (UID: \"00b31b3f-b520-493a-ad26-679e09376e81\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.147342 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/00b31b3f-b520-493a-ad26-679e09376e81-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-fpxmr\" (UID: \"00b31b3f-b520-493a-ad26-679e09376e81\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.147361 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78lxd\" (UniqueName: \"kubernetes.io/projected/ebe31727-805d-472e-89d3-e99b11435be1-kube-api-access-78lxd\") pod \"nmstate-handler-vqqqz\" (UID: \"ebe31727-805d-472e-89d3-e99b11435be1\") " pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.147407 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ebe31727-805d-472e-89d3-e99b11435be1-ovs-socket\") pod \"nmstate-handler-vqqqz\" (UID: \"ebe31727-805d-472e-89d3-e99b11435be1\") " pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.147422 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm2mf\" (UniqueName: \"kubernetes.io/projected/00b31b3f-b520-493a-ad26-679e09376e81-kube-api-access-cm2mf\") pod \"nmstate-console-plugin-86f58fcf4-fpxmr\" (UID: \"00b31b3f-b520-493a-ad26-679e09376e81\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.147446 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ebe31727-805d-472e-89d3-e99b11435be1-nmstate-lock\") pod \"nmstate-handler-vqqqz\" (UID: \"ebe31727-805d-472e-89d3-e99b11435be1\") " pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.147682 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ebe31727-805d-472e-89d3-e99b11435be1-dbus-socket\") pod \"nmstate-handler-vqqqz\" (UID: \"ebe31727-805d-472e-89d3-e99b11435be1\") " pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.148207 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ebe31727-805d-472e-89d3-e99b11435be1-ovs-socket\") pod \"nmstate-handler-vqqqz\" (UID: \"ebe31727-805d-472e-89d3-e99b11435be1\") " pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.148451 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ebe31727-805d-472e-89d3-e99b11435be1-nmstate-lock\") pod \"nmstate-handler-vqqqz\" (UID: \"ebe31727-805d-472e-89d3-e99b11435be1\") " pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.149588 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/00b31b3f-b520-493a-ad26-679e09376e81-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-fpxmr\" (UID: \"00b31b3f-b520-493a-ad26-679e09376e81\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.154769 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/00b31b3f-b520-493a-ad26-679e09376e81-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-fpxmr\" (UID: \"00b31b3f-b520-493a-ad26-679e09376e81\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.172544 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78lxd\" 
(UniqueName: \"kubernetes.io/projected/ebe31727-805d-472e-89d3-e99b11435be1-kube-api-access-78lxd\") pod \"nmstate-handler-vqqqz\" (UID: \"ebe31727-805d-472e-89d3-e99b11435be1\") " pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.176569 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm2mf\" (UniqueName: \"kubernetes.io/projected/00b31b3f-b520-493a-ad26-679e09376e81-kube-api-access-cm2mf\") pod \"nmstate-console-plugin-86f58fcf4-fpxmr\" (UID: \"00b31b3f-b520-493a-ad26-679e09376e81\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.193162 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8xzdk" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.213230 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-854454756c-m4vqj"] Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.214074 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.226216 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-854454756c-m4vqj"] Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.232195 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.327301 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.350542 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a7ec137-20d8-418a-a85e-70034882f17b-console-oauth-config\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.350613 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a7ec137-20d8-418a-a85e-70034882f17b-service-ca\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.350665 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpd9s\" (UniqueName: \"kubernetes.io/projected/5a7ec137-20d8-418a-a85e-70034882f17b-kube-api-access-vpd9s\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.350701 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a7ec137-20d8-418a-a85e-70034882f17b-console-serving-cert\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.350749 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/5a7ec137-20d8-418a-a85e-70034882f17b-console-config\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.350825 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a7ec137-20d8-418a-a85e-70034882f17b-trusted-ca-bundle\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.350875 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a7ec137-20d8-418a-a85e-70034882f17b-oauth-serving-cert\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.452393 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a7ec137-20d8-418a-a85e-70034882f17b-console-oauth-config\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.452453 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a7ec137-20d8-418a-a85e-70034882f17b-service-ca\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.452489 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpd9s\" 
(UniqueName: \"kubernetes.io/projected/5a7ec137-20d8-418a-a85e-70034882f17b-kube-api-access-vpd9s\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.452512 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a7ec137-20d8-418a-a85e-70034882f17b-console-serving-cert\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.452536 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a7ec137-20d8-418a-a85e-70034882f17b-console-config\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.452555 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a7ec137-20d8-418a-a85e-70034882f17b-trusted-ca-bundle\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.452587 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a7ec137-20d8-418a-a85e-70034882f17b-oauth-serving-cert\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.453712 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/5a7ec137-20d8-418a-a85e-70034882f17b-oauth-serving-cert\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.453778 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a7ec137-20d8-418a-a85e-70034882f17b-console-config\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.453863 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a7ec137-20d8-418a-a85e-70034882f17b-service-ca\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.454384 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a7ec137-20d8-418a-a85e-70034882f17b-trusted-ca-bundle\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.457397 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a7ec137-20d8-418a-a85e-70034882f17b-console-oauth-config\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.458534 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5a7ec137-20d8-418a-a85e-70034882f17b-console-serving-cert\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.472568 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpd9s\" (UniqueName: \"kubernetes.io/projected/5a7ec137-20d8-418a-a85e-70034882f17b-kube-api-access-vpd9s\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.529801 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.553975 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0b06c77a-f41d-41a6-b115-f12cc5109c0c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-6cx5h\" (UID: \"0b06c77a-f41d-41a6-b115-f12cc5109c0c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.557449 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0b06c77a-f41d-41a6-b115-f12cc5109c0c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-6cx5h\" (UID: \"0b06c77a-f41d-41a6-b115-f12cc5109c0c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.605714 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-8xzdk"] Mar 13 12:00:47 crc kubenswrapper[4837]: W0313 12:00:47.624945 4837 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d1f2d02_86ab_4679_a4e4_530ad37e4302.slice/crio-a45c2b0e8e0bbe0b8df88fc14927993ad8f2a61aeb5953dd1784c7e56716e258 WatchSource:0}: Error finding container a45c2b0e8e0bbe0b8df88fc14927993ad8f2a61aeb5953dd1784c7e56716e258: Status 404 returned error can't find the container with id a45c2b0e8e0bbe0b8df88fc14927993ad8f2a61aeb5953dd1784c7e56716e258 Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.699521 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-854454756c-m4vqj"] Mar 13 12:00:47 crc kubenswrapper[4837]: W0313 12:00:47.703350 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a7ec137_20d8_418a_a85e_70034882f17b.slice/crio-9ca0676c9684bc9fdaf0b00e8c14402e7bc7de2ba404993ecda5a6ee276442b9 WatchSource:0}: Error finding container 9ca0676c9684bc9fdaf0b00e8c14402e7bc7de2ba404993ecda5a6ee276442b9: Status 404 returned error can't find the container with id 9ca0676c9684bc9fdaf0b00e8c14402e7bc7de2ba404993ecda5a6ee276442b9 Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.728314 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr"] Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.733418 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-854454756c-m4vqj" event={"ID":"5a7ec137-20d8-418a-a85e-70034882f17b","Type":"ContainerStarted","Data":"9ca0676c9684bc9fdaf0b00e8c14402e7bc7de2ba404993ecda5a6ee276442b9"} Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.734478 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-vqqqz" event={"ID":"ebe31727-805d-472e-89d3-e99b11435be1","Type":"ContainerStarted","Data":"60fa02310af7adab0a694c629c6c25df0f989119c50ec725c46f2c14e712994f"} Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.735772 
4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8xzdk" event={"ID":"5d1f2d02-86ab-4679-a4e4-530ad37e4302","Type":"ContainerStarted","Data":"a45c2b0e8e0bbe0b8df88fc14927993ad8f2a61aeb5953dd1784c7e56716e258"} Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.807094 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h" Mar 13 12:00:48 crc kubenswrapper[4837]: I0313 12:00:48.006836 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h"] Mar 13 12:00:48 crc kubenswrapper[4837]: I0313 12:00:48.742655 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr" event={"ID":"00b31b3f-b520-493a-ad26-679e09376e81","Type":"ContainerStarted","Data":"dcbed1a67c9dc1d66a41de49083bbe771dddbc7f5a38bbf0ab421de04ecfe33d"} Mar 13 12:00:48 crc kubenswrapper[4837]: I0313 12:00:48.743844 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h" event={"ID":"0b06c77a-f41d-41a6-b115-f12cc5109c0c","Type":"ContainerStarted","Data":"ca50d1b1a8f24579c70b00e4de89f7644e846fd9eb1c8e85b4fac31c249d87b8"} Mar 13 12:00:48 crc kubenswrapper[4837]: I0313 12:00:48.745570 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-854454756c-m4vqj" event={"ID":"5a7ec137-20d8-418a-a85e-70034882f17b","Type":"ContainerStarted","Data":"a817a4b9822e441997d9c09c8a2b1479db54f18e4babb28b7cd340fe18b1bf1a"} Mar 13 12:00:48 crc kubenswrapper[4837]: I0313 12:00:48.771612 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-854454756c-m4vqj" podStartSLOduration=1.771594417 podStartE2EDuration="1.771594417s" podCreationTimestamp="2026-03-13 12:00:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:00:48.768818059 +0000 UTC m=+764.407084852" watchObservedRunningTime="2026-03-13 12:00:48.771594417 +0000 UTC m=+764.409861180"
Mar 13 12:00:51 crc kubenswrapper[4837]: I0313 12:00:51.766024 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h" event={"ID":"0b06c77a-f41d-41a6-b115-f12cc5109c0c","Type":"ContainerStarted","Data":"071ac06be957e77530585aa750a43b9b20e43da820b08a24ce544438007919af"}
Mar 13 12:00:51 crc kubenswrapper[4837]: I0313 12:00:51.766698 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h"
Mar 13 12:00:51 crc kubenswrapper[4837]: I0313 12:00:51.768088 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-vqqqz" event={"ID":"ebe31727-805d-472e-89d3-e99b11435be1","Type":"ContainerStarted","Data":"3d4ed2e89ee1a42b4883155ccb45c1352121d16356c3190690720e7ed39eab4d"}
Mar 13 12:00:51 crc kubenswrapper[4837]: I0313 12:00:51.768200 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-vqqqz"
Mar 13 12:00:51 crc kubenswrapper[4837]: I0313 12:00:51.769877 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr" event={"ID":"00b31b3f-b520-493a-ad26-679e09376e81","Type":"ContainerStarted","Data":"98059a45b58908574f3200c3b0d93a3aadd72da2cafdf805fa0b95c154ed526f"}
Mar 13 12:00:51 crc kubenswrapper[4837]: I0313 12:00:51.771123 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8xzdk" event={"ID":"5d1f2d02-86ab-4679-a4e4-530ad37e4302","Type":"ContainerStarted","Data":"c3ffde7165f32e645801163b2a292038815fc9c3f316f07353aff84a66ebc113"}
Mar 13 12:00:51 crc kubenswrapper[4837]: I0313 12:00:51.786077 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h" podStartSLOduration=3.182402109 podStartE2EDuration="5.786055436s" podCreationTimestamp="2026-03-13 12:00:46 +0000 UTC" firstStartedPulling="2026-03-13 12:00:48.014648502 +0000 UTC m=+763.652915265" lastFinishedPulling="2026-03-13 12:00:50.618301829 +0000 UTC m=+766.256568592" observedRunningTime="2026-03-13 12:00:51.784436464 +0000 UTC m=+767.422703227" watchObservedRunningTime="2026-03-13 12:00:51.786055436 +0000 UTC m=+767.424322199"
Mar 13 12:00:51 crc kubenswrapper[4837]: I0313 12:00:51.806156 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-vqqqz" podStartSLOduration=2.482703446 podStartE2EDuration="5.806141314s" podCreationTimestamp="2026-03-13 12:00:46 +0000 UTC" firstStartedPulling="2026-03-13 12:00:47.273356865 +0000 UTC m=+762.911623628" lastFinishedPulling="2026-03-13 12:00:50.596794713 +0000 UTC m=+766.235061496" observedRunningTime="2026-03-13 12:00:51.804386189 +0000 UTC m=+767.442652952" watchObservedRunningTime="2026-03-13 12:00:51.806141314 +0000 UTC m=+767.444408077"
Mar 13 12:00:51 crc kubenswrapper[4837]: I0313 12:00:51.825554 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr" podStartSLOduration=2.971000153 podStartE2EDuration="5.825537392s" podCreationTimestamp="2026-03-13 12:00:46 +0000 UTC" firstStartedPulling="2026-03-13 12:00:47.73961049 +0000 UTC m=+763.377877253" lastFinishedPulling="2026-03-13 12:00:50.594147689 +0000 UTC m=+766.232414492" observedRunningTime="2026-03-13 12:00:51.821484553 +0000 UTC m=+767.459751326" watchObservedRunningTime="2026-03-13 12:00:51.825537392 +0000 UTC m=+767.463804155"
Mar 13 12:00:53 crc kubenswrapper[4837]: I0313 12:00:53.783631 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8xzdk" event={"ID":"5d1f2d02-86ab-4679-a4e4-530ad37e4302","Type":"ContainerStarted","Data":"77dff13be078c7601290f9b90ec50db8b6bef2a40f9e3f30dbb760d47ce80e19"}
Mar 13 12:00:53 crc kubenswrapper[4837]: I0313 12:00:53.805408 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8xzdk" podStartSLOduration=2.266604648 podStartE2EDuration="7.805390129s" podCreationTimestamp="2026-03-13 12:00:46 +0000 UTC" firstStartedPulling="2026-03-13 12:00:47.626591154 +0000 UTC m=+763.264857917" lastFinishedPulling="2026-03-13 12:00:53.165376635 +0000 UTC m=+768.803643398" observedRunningTime="2026-03-13 12:00:53.802676613 +0000 UTC m=+769.440943376" watchObservedRunningTime="2026-03-13 12:00:53.805390129 +0000 UTC m=+769.443656892"
Mar 13 12:00:57 crc kubenswrapper[4837]: I0313 12:00:57.259959 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-vqqqz"
Mar 13 12:00:57 crc kubenswrapper[4837]: I0313 12:00:57.530482 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-854454756c-m4vqj"
Mar 13 12:00:57 crc kubenswrapper[4837]: I0313 12:00:57.530555 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-854454756c-m4vqj"
Mar 13 12:00:57 crc kubenswrapper[4837]: I0313 12:00:57.536551 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-854454756c-m4vqj"
Mar 13 12:00:57 crc kubenswrapper[4837]: I0313 12:00:57.820092 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-854454756c-m4vqj"
Mar 13 12:00:57 crc kubenswrapper[4837]: I0313 12:00:57.873942 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-q2qpt"]
Mar 13 12:01:05 crc kubenswrapper[4837]: I0313 12:01:05.484417 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 12:01:05 crc kubenswrapper[4837]: I0313 12:01:05.484969 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 12:01:07 crc kubenswrapper[4837]: I0313 12:01:07.816413 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h"
Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.539574 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf"]
Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.541740 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf"
Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.544174 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.553063 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf"]
Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.702997 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdwzz\" (UniqueName: \"kubernetes.io/projected/b1863878-b849-4485-9e78-35c9f9856697-kube-api-access-kdwzz\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf\" (UID: \"b1863878-b849-4485-9e78-35c9f9856697\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf"
Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.703087 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1863878-b849-4485-9e78-35c9f9856697-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf\" (UID: \"b1863878-b849-4485-9e78-35c9f9856697\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf"
Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.703318 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1863878-b849-4485-9e78-35c9f9856697-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf\" (UID: \"b1863878-b849-4485-9e78-35c9f9856697\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf"
Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.804101 4837 scope.go:117] "RemoveContainer" containerID="35377d4210b529c8401b806fa107dba5beb6002cbc3a3ce3ea9ad22bd10d0960"
Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.805114 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdwzz\" (UniqueName: \"kubernetes.io/projected/b1863878-b849-4485-9e78-35c9f9856697-kube-api-access-kdwzz\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf\" (UID: \"b1863878-b849-4485-9e78-35c9f9856697\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf"
Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.805225 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1863878-b849-4485-9e78-35c9f9856697-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf\" (UID: \"b1863878-b849-4485-9e78-35c9f9856697\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf"
Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.805325 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1863878-b849-4485-9e78-35c9f9856697-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf\" (UID: \"b1863878-b849-4485-9e78-35c9f9856697\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf"
Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.806251 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1863878-b849-4485-9e78-35c9f9856697-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf\" (UID: \"b1863878-b849-4485-9e78-35c9f9856697\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf"
Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.806740 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1863878-b849-4485-9e78-35c9f9856697-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf\" (UID: \"b1863878-b849-4485-9e78-35c9f9856697\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf"
Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.833228 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdwzz\" (UniqueName: \"kubernetes.io/projected/b1863878-b849-4485-9e78-35c9f9856697-kube-api-access-kdwzz\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf\" (UID: \"b1863878-b849-4485-9e78-35c9f9856697\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf"
Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.871801 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf"
Mar 13 12:01:22 crc kubenswrapper[4837]: I0313 12:01:22.071228 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf"]
Mar 13 12:01:22 crc kubenswrapper[4837]: I0313 12:01:22.112370 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf" event={"ID":"b1863878-b849-4485-9e78-35c9f9856697","Type":"ContainerStarted","Data":"39ce74619aec31a4e35d3a5468f0f1734a404c9cf2b1ede413f01a38b7ff24cd"}
Mar 13 12:01:22 crc kubenswrapper[4837]: I0313 12:01:22.936516 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-q2qpt" podUID="c83842ec-9933-4f84-bb4a-c84ca61a28e1" containerName="console" containerID="cri-o://c3e3e9b2ed47e2f7480af78d679ab1d816ea01c193c35244aa52793e0f02f112" gracePeriod=15
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.121987 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-q2qpt_c83842ec-9933-4f84-bb4a-c84ca61a28e1/console/0.log"
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.122263 4837 generic.go:334] "Generic (PLEG): container finished" podID="c83842ec-9933-4f84-bb4a-c84ca61a28e1" containerID="c3e3e9b2ed47e2f7480af78d679ab1d816ea01c193c35244aa52793e0f02f112" exitCode=2
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.122363 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q2qpt" event={"ID":"c83842ec-9933-4f84-bb4a-c84ca61a28e1","Type":"ContainerDied","Data":"c3e3e9b2ed47e2f7480af78d679ab1d816ea01c193c35244aa52793e0f02f112"}
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.126911 4837 generic.go:334] "Generic (PLEG): container finished" podID="b1863878-b849-4485-9e78-35c9f9856697" containerID="d70d2a6694e0790bf492b578e3f25e7018641768f6c2060e8273ce8eba4c9dad" exitCode=0
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.126942 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf" event={"ID":"b1863878-b849-4485-9e78-35c9f9856697","Type":"ContainerDied","Data":"d70d2a6694e0790bf492b578e3f25e7018641768f6c2060e8273ce8eba4c9dad"}
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.283127 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-q2qpt_c83842ec-9933-4f84-bb4a-c84ca61a28e1/console/0.log"
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.283227 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-q2qpt"
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.425922 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-trusted-ca-bundle\") pod \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") "
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.426065 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-serving-cert\") pod \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") "
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.426144 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpjrd\" (UniqueName: \"kubernetes.io/projected/c83842ec-9933-4f84-bb4a-c84ca61a28e1-kube-api-access-jpjrd\") pod \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") "
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.426268 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-config\") pod \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") "
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.426357 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-oauth-config\") pod \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") "
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.426421 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-service-ca\") pod \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") "
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.426475 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-oauth-serving-cert\") pod \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") "
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.427314 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-config" (OuterVolumeSpecName: "console-config") pod "c83842ec-9933-4f84-bb4a-c84ca61a28e1" (UID: "c83842ec-9933-4f84-bb4a-c84ca61a28e1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.427626 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c83842ec-9933-4f84-bb4a-c84ca61a28e1" (UID: "c83842ec-9933-4f84-bb4a-c84ca61a28e1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.427720 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-service-ca" (OuterVolumeSpecName: "service-ca") pod "c83842ec-9933-4f84-bb4a-c84ca61a28e1" (UID: "c83842ec-9933-4f84-bb4a-c84ca61a28e1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.427800 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c83842ec-9933-4f84-bb4a-c84ca61a28e1" (UID: "c83842ec-9933-4f84-bb4a-c84ca61a28e1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.433576 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c83842ec-9933-4f84-bb4a-c84ca61a28e1-kube-api-access-jpjrd" (OuterVolumeSpecName: "kube-api-access-jpjrd") pod "c83842ec-9933-4f84-bb4a-c84ca61a28e1" (UID: "c83842ec-9933-4f84-bb4a-c84ca61a28e1"). InnerVolumeSpecName "kube-api-access-jpjrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.438854 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c83842ec-9933-4f84-bb4a-c84ca61a28e1" (UID: "c83842ec-9933-4f84-bb4a-c84ca61a28e1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.439147 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c83842ec-9933-4f84-bb4a-c84ca61a28e1" (UID: "c83842ec-9933-4f84-bb4a-c84ca61a28e1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.528486 4837 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-config\") on node \"crc\" DevicePath \"\""
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.528537 4837 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.528559 4837 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-service-ca\") on node \"crc\" DevicePath \"\""
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.528576 4837 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.528595 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.528612 4837 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.528629 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpjrd\" (UniqueName: \"kubernetes.io/projected/c83842ec-9933-4f84-bb4a-c84ca61a28e1-kube-api-access-jpjrd\") on node \"crc\" DevicePath \"\""
Mar 13 12:01:24 crc kubenswrapper[4837]: I0313 12:01:24.133931 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-q2qpt_c83842ec-9933-4f84-bb4a-c84ca61a28e1/console/0.log"
Mar 13 12:01:24 crc kubenswrapper[4837]: I0313 12:01:24.134049 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-q2qpt"
Mar 13 12:01:24 crc kubenswrapper[4837]: I0313 12:01:24.134126 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q2qpt" event={"ID":"c83842ec-9933-4f84-bb4a-c84ca61a28e1","Type":"ContainerDied","Data":"6d6886f8a08a9d6498bf2731a6faf601bf8b43c566b4a0dbe066c5557e5e15e0"}
Mar 13 12:01:24 crc kubenswrapper[4837]: I0313 12:01:24.134196 4837 scope.go:117] "RemoveContainer" containerID="c3e3e9b2ed47e2f7480af78d679ab1d816ea01c193c35244aa52793e0f02f112"
Mar 13 12:01:24 crc kubenswrapper[4837]: I0313 12:01:24.168352 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-q2qpt"]
Mar 13 12:01:24 crc kubenswrapper[4837]: I0313 12:01:24.173705 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-q2qpt"]
Mar 13 12:01:25 crc kubenswrapper[4837]: I0313 12:01:25.055913 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c83842ec-9933-4f84-bb4a-c84ca61a28e1" path="/var/lib/kubelet/pods/c83842ec-9933-4f84-bb4a-c84ca61a28e1/volumes"
Mar 13 12:01:25 crc kubenswrapper[4837]: I0313 12:01:25.143418 4837 generic.go:334] "Generic (PLEG): container finished" podID="b1863878-b849-4485-9e78-35c9f9856697" containerID="8bd41bde4a756b05989d08b252d536c3fed0fbc6582602087dd30edfc3ffcfcd" exitCode=0
Mar 13 12:01:25 crc kubenswrapper[4837]: I0313 12:01:25.143471 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf" event={"ID":"b1863878-b849-4485-9e78-35c9f9856697","Type":"ContainerDied","Data":"8bd41bde4a756b05989d08b252d536c3fed0fbc6582602087dd30edfc3ffcfcd"}
Mar 13 12:01:26 crc kubenswrapper[4837]: I0313 12:01:26.154883 4837 generic.go:334] "Generic (PLEG): container finished" podID="b1863878-b849-4485-9e78-35c9f9856697" containerID="b29e084b12ffcf45b76e91c7e0adf96d7385273533a2a302f84e65b130630738" exitCode=0
Mar 13 12:01:26 crc kubenswrapper[4837]: I0313 12:01:26.154940 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf" event={"ID":"b1863878-b849-4485-9e78-35c9f9856697","Type":"ContainerDied","Data":"b29e084b12ffcf45b76e91c7e0adf96d7385273533a2a302f84e65b130630738"}
Mar 13 12:01:27 crc kubenswrapper[4837]: I0313 12:01:27.378786 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf"
Mar 13 12:01:27 crc kubenswrapper[4837]: I0313 12:01:27.577396 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1863878-b849-4485-9e78-35c9f9856697-util\") pod \"b1863878-b849-4485-9e78-35c9f9856697\" (UID: \"b1863878-b849-4485-9e78-35c9f9856697\") "
Mar 13 12:01:27 crc kubenswrapper[4837]: I0313 12:01:27.577468 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1863878-b849-4485-9e78-35c9f9856697-bundle\") pod \"b1863878-b849-4485-9e78-35c9f9856697\" (UID: \"b1863878-b849-4485-9e78-35c9f9856697\") "
Mar 13 12:01:27 crc kubenswrapper[4837]: I0313 12:01:27.577496 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdwzz\" (UniqueName: \"kubernetes.io/projected/b1863878-b849-4485-9e78-35c9f9856697-kube-api-access-kdwzz\") pod \"b1863878-b849-4485-9e78-35c9f9856697\" (UID: \"b1863878-b849-4485-9e78-35c9f9856697\") "
Mar 13 12:01:27 crc kubenswrapper[4837]: I0313 12:01:27.578846 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1863878-b849-4485-9e78-35c9f9856697-bundle" (OuterVolumeSpecName: "bundle") pod "b1863878-b849-4485-9e78-35c9f9856697" (UID: "b1863878-b849-4485-9e78-35c9f9856697"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:01:27 crc kubenswrapper[4837]: I0313 12:01:27.583130 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1863878-b849-4485-9e78-35c9f9856697-kube-api-access-kdwzz" (OuterVolumeSpecName: "kube-api-access-kdwzz") pod "b1863878-b849-4485-9e78-35c9f9856697" (UID: "b1863878-b849-4485-9e78-35c9f9856697"). InnerVolumeSpecName "kube-api-access-kdwzz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:01:27 crc kubenswrapper[4837]: I0313 12:01:27.678940 4837 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1863878-b849-4485-9e78-35c9f9856697-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:01:27 crc kubenswrapper[4837]: I0313 12:01:27.678981 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdwzz\" (UniqueName: \"kubernetes.io/projected/b1863878-b849-4485-9e78-35c9f9856697-kube-api-access-kdwzz\") on node \"crc\" DevicePath \"\""
Mar 13 12:01:27 crc kubenswrapper[4837]: I0313 12:01:27.727059 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1863878-b849-4485-9e78-35c9f9856697-util" (OuterVolumeSpecName: "util") pod "b1863878-b849-4485-9e78-35c9f9856697" (UID: "b1863878-b849-4485-9e78-35c9f9856697"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:01:27 crc kubenswrapper[4837]: I0313 12:01:27.780258 4837 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1863878-b849-4485-9e78-35c9f9856697-util\") on node \"crc\" DevicePath \"\""
Mar 13 12:01:28 crc kubenswrapper[4837]: I0313 12:01:28.166267 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf" event={"ID":"b1863878-b849-4485-9e78-35c9f9856697","Type":"ContainerDied","Data":"39ce74619aec31a4e35d3a5468f0f1734a404c9cf2b1ede413f01a38b7ff24cd"}
Mar 13 12:01:28 crc kubenswrapper[4837]: I0313 12:01:28.166302 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39ce74619aec31a4e35d3a5468f0f1734a404c9cf2b1ede413f01a38b7ff24cd"
Mar 13 12:01:28 crc kubenswrapper[4837]: I0313 12:01:28.166308 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf"
Mar 13 12:01:35 crc kubenswrapper[4837]: I0313 12:01:35.484100 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 12:01:35 crc kubenswrapper[4837]: I0313 12:01:35.484444 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.293989 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d"]
Mar 13 12:01:42 crc kubenswrapper[4837]: E0313 12:01:42.294855 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1863878-b849-4485-9e78-35c9f9856697" containerName="pull"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.294871 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1863878-b849-4485-9e78-35c9f9856697" containerName="pull"
Mar 13 12:01:42 crc kubenswrapper[4837]: E0313 12:01:42.294919 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1863878-b849-4485-9e78-35c9f9856697" containerName="extract"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.294925 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1863878-b849-4485-9e78-35c9f9856697" containerName="extract"
Mar 13 12:01:42 crc kubenswrapper[4837]: E0313 12:01:42.294935 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83842ec-9933-4f84-bb4a-c84ca61a28e1" containerName="console"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.294941 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83842ec-9933-4f84-bb4a-c84ca61a28e1" containerName="console"
Mar 13 12:01:42 crc kubenswrapper[4837]: E0313 12:01:42.294954 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1863878-b849-4485-9e78-35c9f9856697" containerName="util"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.294959 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1863878-b849-4485-9e78-35c9f9856697" containerName="util"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.295070 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83842ec-9933-4f84-bb4a-c84ca61a28e1" containerName="console"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.295083 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1863878-b849-4485-9e78-35c9f9856697" containerName="extract"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.295504 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.297202 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.297546 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.298191 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.298548 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.298748 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-c94w4"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.308581 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d"]
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.366711 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq8fd\" (UniqueName: \"kubernetes.io/projected/41898fd8-d078-444c-bb55-33f4fb6f3dcc-kube-api-access-cq8fd\") pod \"metallb-operator-controller-manager-dcfbdf95f-7x96d\" (UID: \"41898fd8-d078-444c-bb55-33f4fb6f3dcc\") " pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.366859 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41898fd8-d078-444c-bb55-33f4fb6f3dcc-webhook-cert\") pod \"metallb-operator-controller-manager-dcfbdf95f-7x96d\" (UID: \"41898fd8-d078-444c-bb55-33f4fb6f3dcc\") " pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.367014 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41898fd8-d078-444c-bb55-33f4fb6f3dcc-apiservice-cert\") pod \"metallb-operator-controller-manager-dcfbdf95f-7x96d\" (UID: \"41898fd8-d078-444c-bb55-33f4fb6f3dcc\") " pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.467683 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41898fd8-d078-444c-bb55-33f4fb6f3dcc-apiservice-cert\") pod \"metallb-operator-controller-manager-dcfbdf95f-7x96d\" (UID: \"41898fd8-d078-444c-bb55-33f4fb6f3dcc\") " pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.467744 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq8fd\" (UniqueName: \"kubernetes.io/projected/41898fd8-d078-444c-bb55-33f4fb6f3dcc-kube-api-access-cq8fd\") pod \"metallb-operator-controller-manager-dcfbdf95f-7x96d\" (UID: \"41898fd8-d078-444c-bb55-33f4fb6f3dcc\") " pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.467777 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41898fd8-d078-444c-bb55-33f4fb6f3dcc-webhook-cert\") pod \"metallb-operator-controller-manager-dcfbdf95f-7x96d\" (UID: \"41898fd8-d078-444c-bb55-33f4fb6f3dcc\") " pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.474240 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41898fd8-d078-444c-bb55-33f4fb6f3dcc-apiservice-cert\") pod \"metallb-operator-controller-manager-dcfbdf95f-7x96d\" (UID: \"41898fd8-d078-444c-bb55-33f4fb6f3dcc\") " pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.476756 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41898fd8-d078-444c-bb55-33f4fb6f3dcc-webhook-cert\") pod \"metallb-operator-controller-manager-dcfbdf95f-7x96d\" (UID: \"41898fd8-d078-444c-bb55-33f4fb6f3dcc\") " pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.498490 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq8fd\" (UniqueName: \"kubernetes.io/projected/41898fd8-d078-444c-bb55-33f4fb6f3dcc-kube-api-access-cq8fd\") pod \"metallb-operator-controller-manager-dcfbdf95f-7x96d\" (UID: \"41898fd8-d078-444c-bb55-33f4fb6f3dcc\") " pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.546295 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm"]
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.547169 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.548827 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.549185 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-7tcx9"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.550095 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.568554 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm"]
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.613087 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.669411 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eabfad13-4fe4-495d-8b6a-2da56ef3b826-apiservice-cert\") pod \"metallb-operator-webhook-server-59b847b88-lrvzm\" (UID: \"eabfad13-4fe4-495d-8b6a-2da56ef3b826\") " pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.669480 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79kfq\" (UniqueName: \"kubernetes.io/projected/eabfad13-4fe4-495d-8b6a-2da56ef3b826-kube-api-access-79kfq\") pod \"metallb-operator-webhook-server-59b847b88-lrvzm\" (UID: \"eabfad13-4fe4-495d-8b6a-2da56ef3b826\") " pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm"
Mar 13 12:01:42 crc kubenswrapper[4837]: I0313
12:01:42.669613 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eabfad13-4fe4-495d-8b6a-2da56ef3b826-webhook-cert\") pod \"metallb-operator-webhook-server-59b847b88-lrvzm\" (UID: \"eabfad13-4fe4-495d-8b6a-2da56ef3b826\") " pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.770326 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eabfad13-4fe4-495d-8b6a-2da56ef3b826-webhook-cert\") pod \"metallb-operator-webhook-server-59b847b88-lrvzm\" (UID: \"eabfad13-4fe4-495d-8b6a-2da56ef3b826\") " pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.770375 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eabfad13-4fe4-495d-8b6a-2da56ef3b826-apiservice-cert\") pod \"metallb-operator-webhook-server-59b847b88-lrvzm\" (UID: \"eabfad13-4fe4-495d-8b6a-2da56ef3b826\") " pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.770420 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79kfq\" (UniqueName: \"kubernetes.io/projected/eabfad13-4fe4-495d-8b6a-2da56ef3b826-kube-api-access-79kfq\") pod \"metallb-operator-webhook-server-59b847b88-lrvzm\" (UID: \"eabfad13-4fe4-495d-8b6a-2da56ef3b826\") " pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.788884 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79kfq\" (UniqueName: \"kubernetes.io/projected/eabfad13-4fe4-495d-8b6a-2da56ef3b826-kube-api-access-79kfq\") pod 
\"metallb-operator-webhook-server-59b847b88-lrvzm\" (UID: \"eabfad13-4fe4-495d-8b6a-2da56ef3b826\") " pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.789863 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eabfad13-4fe4-495d-8b6a-2da56ef3b826-webhook-cert\") pod \"metallb-operator-webhook-server-59b847b88-lrvzm\" (UID: \"eabfad13-4fe4-495d-8b6a-2da56ef3b826\") " pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.790757 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eabfad13-4fe4-495d-8b6a-2da56ef3b826-apiservice-cert\") pod \"metallb-operator-webhook-server-59b847b88-lrvzm\" (UID: \"eabfad13-4fe4-495d-8b6a-2da56ef3b826\") " pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.834530 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d"] Mar 13 12:01:42 crc kubenswrapper[4837]: W0313 12:01:42.843955 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41898fd8_d078_444c_bb55_33f4fb6f3dcc.slice/crio-556281005f3a1cf373a5555b2550b92e059922df58c9f858f29a146f510ef88b WatchSource:0}: Error finding container 556281005f3a1cf373a5555b2550b92e059922df58c9f858f29a146f510ef88b: Status 404 returned error can't find the container with id 556281005f3a1cf373a5555b2550b92e059922df58c9f858f29a146f510ef88b Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.861921 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" Mar 13 12:01:43 crc kubenswrapper[4837]: I0313 12:01:43.166204 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm"] Mar 13 12:01:43 crc kubenswrapper[4837]: W0313 12:01:43.172126 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeabfad13_4fe4_495d_8b6a_2da56ef3b826.slice/crio-d8bdc8d8711afb68826ef12318f9938164f8d7d30cc7cb4e21f4a38ce87b3fbd WatchSource:0}: Error finding container d8bdc8d8711afb68826ef12318f9938164f8d7d30cc7cb4e21f4a38ce87b3fbd: Status 404 returned error can't find the container with id d8bdc8d8711afb68826ef12318f9938164f8d7d30cc7cb4e21f4a38ce87b3fbd Mar 13 12:01:43 crc kubenswrapper[4837]: I0313 12:01:43.249274 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d" event={"ID":"41898fd8-d078-444c-bb55-33f4fb6f3dcc","Type":"ContainerStarted","Data":"556281005f3a1cf373a5555b2550b92e059922df58c9f858f29a146f510ef88b"} Mar 13 12:01:43 crc kubenswrapper[4837]: I0313 12:01:43.250837 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" event={"ID":"eabfad13-4fe4-495d-8b6a-2da56ef3b826","Type":"ContainerStarted","Data":"d8bdc8d8711afb68826ef12318f9938164f8d7d30cc7cb4e21f4a38ce87b3fbd"} Mar 13 12:01:43 crc kubenswrapper[4837]: I0313 12:01:43.543106 4837 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 13 12:01:48 crc kubenswrapper[4837]: I0313 12:01:48.277700 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" 
event={"ID":"eabfad13-4fe4-495d-8b6a-2da56ef3b826","Type":"ContainerStarted","Data":"5867f75d893ec94df4853f1b22129c251b87de3e8df0d6428afe29859087ae4d"} Mar 13 12:01:48 crc kubenswrapper[4837]: I0313 12:01:48.278303 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" Mar 13 12:01:48 crc kubenswrapper[4837]: I0313 12:01:48.279495 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d" event={"ID":"41898fd8-d078-444c-bb55-33f4fb6f3dcc","Type":"ContainerStarted","Data":"6b34c95f375b9e255d75b6073d42cd050581e42da8a9db66004cbbfaa97b1979"} Mar 13 12:01:48 crc kubenswrapper[4837]: I0313 12:01:48.279668 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d" Mar 13 12:01:48 crc kubenswrapper[4837]: I0313 12:01:48.302299 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" podStartSLOduration=1.7485411819999999 podStartE2EDuration="6.30227717s" podCreationTimestamp="2026-03-13 12:01:42 +0000 UTC" firstStartedPulling="2026-03-13 12:01:43.176392445 +0000 UTC m=+818.814659218" lastFinishedPulling="2026-03-13 12:01:47.730128433 +0000 UTC m=+823.368395206" observedRunningTime="2026-03-13 12:01:48.297896902 +0000 UTC m=+823.936163665" watchObservedRunningTime="2026-03-13 12:01:48.30227717 +0000 UTC m=+823.940543933" Mar 13 12:01:48 crc kubenswrapper[4837]: I0313 12:01:48.318925 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d" podStartSLOduration=1.440459591 podStartE2EDuration="6.318872886s" podCreationTimestamp="2026-03-13 12:01:42 +0000 UTC" firstStartedPulling="2026-03-13 12:01:42.845338776 +0000 UTC m=+818.483605539" lastFinishedPulling="2026-03-13 
12:01:47.723752061 +0000 UTC m=+823.362018834" observedRunningTime="2026-03-13 12:01:48.318770453 +0000 UTC m=+823.957037216" watchObservedRunningTime="2026-03-13 12:01:48.318872886 +0000 UTC m=+823.957139669" Mar 13 12:02:00 crc kubenswrapper[4837]: I0313 12:02:00.132800 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556722-h599x"] Mar 13 12:02:00 crc kubenswrapper[4837]: I0313 12:02:00.134005 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556722-h599x" Mar 13 12:02:00 crc kubenswrapper[4837]: I0313 12:02:00.136065 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:02:00 crc kubenswrapper[4837]: I0313 12:02:00.138272 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:02:00 crc kubenswrapper[4837]: I0313 12:02:00.138276 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:02:00 crc kubenswrapper[4837]: I0313 12:02:00.145225 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556722-h599x"] Mar 13 12:02:00 crc kubenswrapper[4837]: I0313 12:02:00.314626 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj6p2\" (UniqueName: \"kubernetes.io/projected/be033789-27be-444d-b72e-7abbbb34b285-kube-api-access-jj6p2\") pod \"auto-csr-approver-29556722-h599x\" (UID: \"be033789-27be-444d-b72e-7abbbb34b285\") " pod="openshift-infra/auto-csr-approver-29556722-h599x" Mar 13 12:02:00 crc kubenswrapper[4837]: I0313 12:02:00.416077 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj6p2\" (UniqueName: \"kubernetes.io/projected/be033789-27be-444d-b72e-7abbbb34b285-kube-api-access-jj6p2\") pod 
\"auto-csr-approver-29556722-h599x\" (UID: \"be033789-27be-444d-b72e-7abbbb34b285\") " pod="openshift-infra/auto-csr-approver-29556722-h599x" Mar 13 12:02:00 crc kubenswrapper[4837]: I0313 12:02:00.454178 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj6p2\" (UniqueName: \"kubernetes.io/projected/be033789-27be-444d-b72e-7abbbb34b285-kube-api-access-jj6p2\") pod \"auto-csr-approver-29556722-h599x\" (UID: \"be033789-27be-444d-b72e-7abbbb34b285\") " pod="openshift-infra/auto-csr-approver-29556722-h599x" Mar 13 12:02:00 crc kubenswrapper[4837]: I0313 12:02:00.751873 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556722-h599x" Mar 13 12:02:01 crc kubenswrapper[4837]: I0313 12:02:01.007346 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556722-h599x"] Mar 13 12:02:01 crc kubenswrapper[4837]: I0313 12:02:01.356053 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556722-h599x" event={"ID":"be033789-27be-444d-b72e-7abbbb34b285","Type":"ContainerStarted","Data":"c15701a0f861e4e4c3217a9bebcd0ebde36dbbecdb674e106e7fb3aae44db1c2"} Mar 13 12:02:02 crc kubenswrapper[4837]: I0313 12:02:02.872364 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" Mar 13 12:02:04 crc kubenswrapper[4837]: I0313 12:02:04.375358 4837 generic.go:334] "Generic (PLEG): container finished" podID="be033789-27be-444d-b72e-7abbbb34b285" containerID="bf1679f5dae4d4dbf23dda0605e595646a6c9aa5a55d2f380823eb7ec590b836" exitCode=0 Mar 13 12:02:04 crc kubenswrapper[4837]: I0313 12:02:04.375458 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556722-h599x" 
event={"ID":"be033789-27be-444d-b72e-7abbbb34b285","Type":"ContainerDied","Data":"bf1679f5dae4d4dbf23dda0605e595646a6c9aa5a55d2f380823eb7ec590b836"} Mar 13 12:02:05 crc kubenswrapper[4837]: I0313 12:02:05.484029 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:02:05 crc kubenswrapper[4837]: I0313 12:02:05.484356 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:02:05 crc kubenswrapper[4837]: I0313 12:02:05.484459 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 12:02:05 crc kubenswrapper[4837]: I0313 12:02:05.485430 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"86010f8ae6e03e22840b0db405e4816a52e1a80af0eff6188dd5d3d81e63937a"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:02:05 crc kubenswrapper[4837]: I0313 12:02:05.485476 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://86010f8ae6e03e22840b0db405e4816a52e1a80af0eff6188dd5d3d81e63937a" gracePeriod=600 Mar 13 12:02:05 crc kubenswrapper[4837]: I0313 12:02:05.622381 4837 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556722-h599x" Mar 13 12:02:05 crc kubenswrapper[4837]: I0313 12:02:05.791469 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj6p2\" (UniqueName: \"kubernetes.io/projected/be033789-27be-444d-b72e-7abbbb34b285-kube-api-access-jj6p2\") pod \"be033789-27be-444d-b72e-7abbbb34b285\" (UID: \"be033789-27be-444d-b72e-7abbbb34b285\") " Mar 13 12:02:05 crc kubenswrapper[4837]: I0313 12:02:05.799558 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be033789-27be-444d-b72e-7abbbb34b285-kube-api-access-jj6p2" (OuterVolumeSpecName: "kube-api-access-jj6p2") pod "be033789-27be-444d-b72e-7abbbb34b285" (UID: "be033789-27be-444d-b72e-7abbbb34b285"). InnerVolumeSpecName "kube-api-access-jj6p2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:02:05 crc kubenswrapper[4837]: I0313 12:02:05.893590 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj6p2\" (UniqueName: \"kubernetes.io/projected/be033789-27be-444d-b72e-7abbbb34b285-kube-api-access-jj6p2\") on node \"crc\" DevicePath \"\"" Mar 13 12:02:06 crc kubenswrapper[4837]: I0313 12:02:06.389245 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556722-h599x" event={"ID":"be033789-27be-444d-b72e-7abbbb34b285","Type":"ContainerDied","Data":"c15701a0f861e4e4c3217a9bebcd0ebde36dbbecdb674e106e7fb3aae44db1c2"} Mar 13 12:02:06 crc kubenswrapper[4837]: I0313 12:02:06.389749 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c15701a0f861e4e4c3217a9bebcd0ebde36dbbecdb674e106e7fb3aae44db1c2" Mar 13 12:02:06 crc kubenswrapper[4837]: I0313 12:02:06.389277 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556722-h599x" Mar 13 12:02:06 crc kubenswrapper[4837]: I0313 12:02:06.391572 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="86010f8ae6e03e22840b0db405e4816a52e1a80af0eff6188dd5d3d81e63937a" exitCode=0 Mar 13 12:02:06 crc kubenswrapper[4837]: I0313 12:02:06.391615 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"86010f8ae6e03e22840b0db405e4816a52e1a80af0eff6188dd5d3d81e63937a"} Mar 13 12:02:06 crc kubenswrapper[4837]: I0313 12:02:06.391669 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"62df99fa64e257c350cea1390039e0bd2f2c672bf6d80836ec3df94beec3d8d1"} Mar 13 12:02:06 crc kubenswrapper[4837]: I0313 12:02:06.391690 4837 scope.go:117] "RemoveContainer" containerID="2bc74d238c7c3c8f94dfb05f2715b04d643751479532bb38893e7ef8db5a10d2" Mar 13 12:02:06 crc kubenswrapper[4837]: I0313 12:02:06.674907 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556716-csq4j"] Mar 13 12:02:06 crc kubenswrapper[4837]: I0313 12:02:06.678591 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556716-csq4j"] Mar 13 12:02:07 crc kubenswrapper[4837]: I0313 12:02:07.055026 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a7b275e-9d21-4da0-8bb8-0fee8434ce82" path="/var/lib/kubelet/pods/0a7b275e-9d21-4da0-8bb8-0fee8434ce82/volumes" Mar 13 12:02:21 crc kubenswrapper[4837]: I0313 12:02:21.867148 4837 scope.go:117] "RemoveContainer" containerID="b8629809cebf6aa743a349229b16e8ffb9aaa032ac5c2d5f39b44ba6478a1a13" Mar 13 12:02:22 crc 
kubenswrapper[4837]: I0313 12:02:22.616546 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.419937 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-f8m9m"] Mar 13 12:02:23 crc kubenswrapper[4837]: E0313 12:02:23.420734 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be033789-27be-444d-b72e-7abbbb34b285" containerName="oc" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.420751 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="be033789-27be-444d-b72e-7abbbb34b285" containerName="oc" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.420877 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="be033789-27be-444d-b72e-7abbbb34b285" containerName="oc" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.423033 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.426098 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.427154 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-tj5lp" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.430754 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.433784 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7"] Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.438289 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.443756 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.449329 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7wbb\" (UniqueName: \"kubernetes.io/projected/387739fd-caae-44d0-8cbb-50808d69618b-kube-api-access-j7wbb\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.449371 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/387739fd-caae-44d0-8cbb-50808d69618b-frr-conf\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.449400 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/387739fd-caae-44d0-8cbb-50808d69618b-frr-sockets\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.449548 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/387739fd-caae-44d0-8cbb-50808d69618b-reloader\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.449594 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/387739fd-caae-44d0-8cbb-50808d69618b-metrics-certs\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.449626 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/387739fd-caae-44d0-8cbb-50808d69618b-metrics\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.449725 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjwhv\" (UniqueName: \"kubernetes.io/projected/c72405c5-2c81-43f4-93c6-f73f9771be8b-kube-api-access-hjwhv\") pod \"frr-k8s-webhook-server-bcc4b6f68-jwgl7\" (UID: \"c72405c5-2c81-43f4-93c6-f73f9771be8b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.449590 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7"] Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.449764 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/387739fd-caae-44d0-8cbb-50808d69618b-frr-startup\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.449788 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c72405c5-2c81-43f4-93c6-f73f9771be8b-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jwgl7\" (UID: \"c72405c5-2c81-43f4-93c6-f73f9771be8b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7" Mar 13 12:02:23 crc 
kubenswrapper[4837]: I0313 12:02:23.527864 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-8skdh"] Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.528730 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-8skdh" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.530567 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.530870 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-4kgm2" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.532299 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.533787 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.543460 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-zm9dj"] Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.544532 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-zm9dj" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.546853 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.550548 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/82a5fe00-90be-47b1-a357-69942f385d4f-memberlist\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " pod="metallb-system/speaker-8skdh" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.550594 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/82a5fe00-90be-47b1-a357-69942f385d4f-metallb-excludel2\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " pod="metallb-system/speaker-8skdh" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.550628 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/387739fd-caae-44d0-8cbb-50808d69618b-reloader\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.550682 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/387739fd-caae-44d0-8cbb-50808d69618b-metrics-certs\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.550706 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/387739fd-caae-44d0-8cbb-50808d69618b-metrics\") pod \"frr-k8s-f8m9m\" (UID: 
\"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.550741 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjwhv\" (UniqueName: \"kubernetes.io/projected/c72405c5-2c81-43f4-93c6-f73f9771be8b-kube-api-access-hjwhv\") pod \"frr-k8s-webhook-server-bcc4b6f68-jwgl7\" (UID: \"c72405c5-2c81-43f4-93c6-f73f9771be8b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.550761 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/387739fd-caae-44d0-8cbb-50808d69618b-frr-startup\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.550777 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c72405c5-2c81-43f4-93c6-f73f9771be8b-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jwgl7\" (UID: \"c72405c5-2c81-43f4-93c6-f73f9771be8b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.550797 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82a5fe00-90be-47b1-a357-69942f385d4f-metrics-certs\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " pod="metallb-system/speaker-8skdh" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.550822 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7wbb\" (UniqueName: \"kubernetes.io/projected/387739fd-caae-44d0-8cbb-50808d69618b-kube-api-access-j7wbb\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " 
pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.550837 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/387739fd-caae-44d0-8cbb-50808d69618b-frr-conf\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.550857 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pvhj\" (UniqueName: \"kubernetes.io/projected/82a5fe00-90be-47b1-a357-69942f385d4f-kube-api-access-8pvhj\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " pod="metallb-system/speaker-8skdh" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.550881 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/387739fd-caae-44d0-8cbb-50808d69618b-frr-sockets\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.551376 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/387739fd-caae-44d0-8cbb-50808d69618b-frr-sockets\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: E0313 12:02:23.551612 4837 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.551674 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/387739fd-caae-44d0-8cbb-50808d69618b-metrics\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " 
pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.551687 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/387739fd-caae-44d0-8cbb-50808d69618b-frr-conf\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: E0313 12:02:23.551781 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/387739fd-caae-44d0-8cbb-50808d69618b-metrics-certs podName:387739fd-caae-44d0-8cbb-50808d69618b nodeName:}" failed. No retries permitted until 2026-03-13 12:02:24.051761748 +0000 UTC m=+859.690028511 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/387739fd-caae-44d0-8cbb-50808d69618b-metrics-certs") pod "frr-k8s-f8m9m" (UID: "387739fd-caae-44d0-8cbb-50808d69618b") : secret "frr-k8s-certs-secret" not found Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.551984 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/387739fd-caae-44d0-8cbb-50808d69618b-reloader\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.552497 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/387739fd-caae-44d0-8cbb-50808d69618b-frr-startup\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.560595 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c72405c5-2c81-43f4-93c6-f73f9771be8b-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jwgl7\" (UID: 
\"c72405c5-2c81-43f4-93c6-f73f9771be8b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.568966 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7wbb\" (UniqueName: \"kubernetes.io/projected/387739fd-caae-44d0-8cbb-50808d69618b-kube-api-access-j7wbb\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.571365 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-zm9dj"] Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.575084 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjwhv\" (UniqueName: \"kubernetes.io/projected/c72405c5-2c81-43f4-93c6-f73f9771be8b-kube-api-access-hjwhv\") pod \"frr-k8s-webhook-server-bcc4b6f68-jwgl7\" (UID: \"c72405c5-2c81-43f4-93c6-f73f9771be8b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.652359 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82a5fe00-90be-47b1-a357-69942f385d4f-metrics-certs\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " pod="metallb-system/speaker-8skdh" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.652696 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pvhj\" (UniqueName: \"kubernetes.io/projected/82a5fe00-90be-47b1-a357-69942f385d4f-kube-api-access-8pvhj\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " pod="metallb-system/speaker-8skdh" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.652835 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/82a5fe00-90be-47b1-a357-69942f385d4f-memberlist\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " pod="metallb-system/speaker-8skdh" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.652952 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/82a5fe00-90be-47b1-a357-69942f385d4f-metallb-excludel2\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " pod="metallb-system/speaker-8skdh" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.653072 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88-metrics-certs\") pod \"controller-7bb4cc7c98-zm9dj\" (UID: \"0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88\") " pod="metallb-system/controller-7bb4cc7c98-zm9dj" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.653191 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnk2z\" (UniqueName: \"kubernetes.io/projected/0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88-kube-api-access-bnk2z\") pod \"controller-7bb4cc7c98-zm9dj\" (UID: \"0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88\") " pod="metallb-system/controller-7bb4cc7c98-zm9dj" Mar 13 12:02:23 crc kubenswrapper[4837]: E0313 12:02:23.652969 4837 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.653312 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88-cert\") pod \"controller-7bb4cc7c98-zm9dj\" (UID: \"0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88\") " pod="metallb-system/controller-7bb4cc7c98-zm9dj" Mar 13 12:02:23 crc kubenswrapper[4837]: E0313 
12:02:23.653389 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82a5fe00-90be-47b1-a357-69942f385d4f-memberlist podName:82a5fe00-90be-47b1-a357-69942f385d4f nodeName:}" failed. No retries permitted until 2026-03-13 12:02:24.153358466 +0000 UTC m=+859.791625249 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/82a5fe00-90be-47b1-a357-69942f385d4f-memberlist") pod "speaker-8skdh" (UID: "82a5fe00-90be-47b1-a357-69942f385d4f") : secret "metallb-memberlist" not found Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.654097 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/82a5fe00-90be-47b1-a357-69942f385d4f-metallb-excludel2\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " pod="metallb-system/speaker-8skdh" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.656143 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82a5fe00-90be-47b1-a357-69942f385d4f-metrics-certs\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " pod="metallb-system/speaker-8skdh" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.672456 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pvhj\" (UniqueName: \"kubernetes.io/projected/82a5fe00-90be-47b1-a357-69942f385d4f-kube-api-access-8pvhj\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " pod="metallb-system/speaker-8skdh" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.753270 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.754364 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88-metrics-certs\") pod \"controller-7bb4cc7c98-zm9dj\" (UID: \"0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88\") " pod="metallb-system/controller-7bb4cc7c98-zm9dj" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.754413 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnk2z\" (UniqueName: \"kubernetes.io/projected/0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88-kube-api-access-bnk2z\") pod \"controller-7bb4cc7c98-zm9dj\" (UID: \"0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88\") " pod="metallb-system/controller-7bb4cc7c98-zm9dj" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.754459 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88-cert\") pod \"controller-7bb4cc7c98-zm9dj\" (UID: \"0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88\") " pod="metallb-system/controller-7bb4cc7c98-zm9dj" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.757436 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88-metrics-certs\") pod \"controller-7bb4cc7c98-zm9dj\" (UID: \"0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88\") " pod="metallb-system/controller-7bb4cc7c98-zm9dj" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.759446 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.769071 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88-cert\") pod \"controller-7bb4cc7c98-zm9dj\" (UID: \"0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88\") " pod="metallb-system/controller-7bb4cc7c98-zm9dj" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.773386 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnk2z\" (UniqueName: \"kubernetes.io/projected/0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88-kube-api-access-bnk2z\") pod \"controller-7bb4cc7c98-zm9dj\" (UID: \"0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88\") " pod="metallb-system/controller-7bb4cc7c98-zm9dj" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.859587 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-zm9dj" Mar 13 12:02:24 crc kubenswrapper[4837]: I0313 12:02:24.057383 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/387739fd-caae-44d0-8cbb-50808d69618b-metrics-certs\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:24 crc kubenswrapper[4837]: I0313 12:02:24.061813 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/387739fd-caae-44d0-8cbb-50808d69618b-metrics-certs\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:24 crc kubenswrapper[4837]: I0313 12:02:24.152031 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7"] Mar 13 12:02:24 crc kubenswrapper[4837]: I0313 12:02:24.158175 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/82a5fe00-90be-47b1-a357-69942f385d4f-memberlist\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " 
pod="metallb-system/speaker-8skdh" Mar 13 12:02:24 crc kubenswrapper[4837]: E0313 12:02:24.158426 4837 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 13 12:02:24 crc kubenswrapper[4837]: E0313 12:02:24.158517 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82a5fe00-90be-47b1-a357-69942f385d4f-memberlist podName:82a5fe00-90be-47b1-a357-69942f385d4f nodeName:}" failed. No retries permitted until 2026-03-13 12:02:25.158480496 +0000 UTC m=+860.796747279 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/82a5fe00-90be-47b1-a357-69942f385d4f-memberlist") pod "speaker-8skdh" (UID: "82a5fe00-90be-47b1-a357-69942f385d4f") : secret "metallb-memberlist" not found Mar 13 12:02:24 crc kubenswrapper[4837]: I0313 12:02:24.278700 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-zm9dj"] Mar 13 12:02:24 crc kubenswrapper[4837]: W0313 12:02:24.284989 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ad270d6_2fc1_4ed0_8a87_bef0e59a4c88.slice/crio-457175fc4d232168abb6b2f253ec67d16917fe945801bf05877ce399cde3ce96 WatchSource:0}: Error finding container 457175fc4d232168abb6b2f253ec67d16917fe945801bf05877ce399cde3ce96: Status 404 returned error can't find the container with id 457175fc4d232168abb6b2f253ec67d16917fe945801bf05877ce399cde3ce96 Mar 13 12:02:24 crc kubenswrapper[4837]: I0313 12:02:24.339047 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:24 crc kubenswrapper[4837]: I0313 12:02:24.506809 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-zm9dj" event={"ID":"0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88","Type":"ContainerStarted","Data":"b8bb302cedd72c254f632582758d289ae62e01952ef332c906824ebc90cecb1d"} Mar 13 12:02:24 crc kubenswrapper[4837]: I0313 12:02:24.507142 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-zm9dj" event={"ID":"0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88","Type":"ContainerStarted","Data":"457175fc4d232168abb6b2f253ec67d16917fe945801bf05877ce399cde3ce96"} Mar 13 12:02:24 crc kubenswrapper[4837]: I0313 12:02:24.508366 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f8m9m" event={"ID":"387739fd-caae-44d0-8cbb-50808d69618b","Type":"ContainerStarted","Data":"b41e6ab652b91a272d7eee0125a9021991f1281f0073d450f1287f1aef1cfd76"} Mar 13 12:02:24 crc kubenswrapper[4837]: I0313 12:02:24.509289 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7" event={"ID":"c72405c5-2c81-43f4-93c6-f73f9771be8b","Type":"ContainerStarted","Data":"26fb66e61c49f8b8b5a46f19013cb3458098d9ddf498c1690ff836772ed59a46"} Mar 13 12:02:25 crc kubenswrapper[4837]: I0313 12:02:25.174418 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/82a5fe00-90be-47b1-a357-69942f385d4f-memberlist\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " pod="metallb-system/speaker-8skdh" Mar 13 12:02:25 crc kubenswrapper[4837]: I0313 12:02:25.179999 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/82a5fe00-90be-47b1-a357-69942f385d4f-memberlist\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " 
pod="metallb-system/speaker-8skdh" Mar 13 12:02:25 crc kubenswrapper[4837]: I0313 12:02:25.341270 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-8skdh" Mar 13 12:02:25 crc kubenswrapper[4837]: W0313 12:02:25.361235 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82a5fe00_90be_47b1_a357_69942f385d4f.slice/crio-8a95e0e1133aa61588bf3a7af70f6cf08e0764b1c2a26b8e431cb055e75073ef WatchSource:0}: Error finding container 8a95e0e1133aa61588bf3a7af70f6cf08e0764b1c2a26b8e431cb055e75073ef: Status 404 returned error can't find the container with id 8a95e0e1133aa61588bf3a7af70f6cf08e0764b1c2a26b8e431cb055e75073ef Mar 13 12:02:25 crc kubenswrapper[4837]: I0313 12:02:25.515994 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-zm9dj" event={"ID":"0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88","Type":"ContainerStarted","Data":"89c056d8248ba48802df5e978e56d1e88f2bb66373245aa662decf4a5c20bcd3"} Mar 13 12:02:25 crc kubenswrapper[4837]: I0313 12:02:25.516132 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-zm9dj" Mar 13 12:02:25 crc kubenswrapper[4837]: I0313 12:02:25.517093 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8skdh" event={"ID":"82a5fe00-90be-47b1-a357-69942f385d4f","Type":"ContainerStarted","Data":"8a95e0e1133aa61588bf3a7af70f6cf08e0764b1c2a26b8e431cb055e75073ef"} Mar 13 12:02:25 crc kubenswrapper[4837]: I0313 12:02:25.536114 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-zm9dj" podStartSLOduration=2.536094843 podStartE2EDuration="2.536094843s" podCreationTimestamp="2026-03-13 12:02:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 
12:02:25.532986184 +0000 UTC m=+861.171252947" watchObservedRunningTime="2026-03-13 12:02:25.536094843 +0000 UTC m=+861.174361606" Mar 13 12:02:26 crc kubenswrapper[4837]: I0313 12:02:26.532336 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8skdh" event={"ID":"82a5fe00-90be-47b1-a357-69942f385d4f","Type":"ContainerStarted","Data":"5be0eb884d925b3f3e7348d6400f7da8b66f5bdb2380949f71ee452caecf0ef5"} Mar 13 12:02:26 crc kubenswrapper[4837]: I0313 12:02:26.532741 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8skdh" event={"ID":"82a5fe00-90be-47b1-a357-69942f385d4f","Type":"ContainerStarted","Data":"56dbb91f33e12bab2633a967716fbd79b888097682fe4c05115296fba6fda7d9"} Mar 13 12:02:26 crc kubenswrapper[4837]: I0313 12:02:26.560391 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-8skdh" podStartSLOduration=3.560371478 podStartE2EDuration="3.560371478s" podCreationTimestamp="2026-03-13 12:02:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:02:26.556607379 +0000 UTC m=+862.194874142" watchObservedRunningTime="2026-03-13 12:02:26.560371478 +0000 UTC m=+862.198638241" Mar 13 12:02:27 crc kubenswrapper[4837]: I0313 12:02:27.539436 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-8skdh" Mar 13 12:02:31 crc kubenswrapper[4837]: I0313 12:02:31.580679 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7" event={"ID":"c72405c5-2c81-43f4-93c6-f73f9771be8b","Type":"ContainerStarted","Data":"fc4ab59ac329b89ecfa18cdec798aa94b9bda1f43bf8a39626b79ce7619cbe23"} Mar 13 12:02:31 crc kubenswrapper[4837]: I0313 12:02:31.581201 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7" Mar 13 
12:02:31 crc kubenswrapper[4837]: I0313 12:02:31.582810 4837 generic.go:334] "Generic (PLEG): container finished" podID="387739fd-caae-44d0-8cbb-50808d69618b" containerID="35462f4ca915f217aa024a344aa2bc5178b1a67828449d152ba31abfe87cc855" exitCode=0 Mar 13 12:02:31 crc kubenswrapper[4837]: I0313 12:02:31.582840 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f8m9m" event={"ID":"387739fd-caae-44d0-8cbb-50808d69618b","Type":"ContainerDied","Data":"35462f4ca915f217aa024a344aa2bc5178b1a67828449d152ba31abfe87cc855"} Mar 13 12:02:31 crc kubenswrapper[4837]: I0313 12:02:31.600418 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7" podStartSLOduration=1.4743894229999999 podStartE2EDuration="8.600399575s" podCreationTimestamp="2026-03-13 12:02:23 +0000 UTC" firstStartedPulling="2026-03-13 12:02:24.156551235 +0000 UTC m=+859.794818008" lastFinishedPulling="2026-03-13 12:02:31.282561397 +0000 UTC m=+866.920828160" observedRunningTime="2026-03-13 12:02:31.598211165 +0000 UTC m=+867.236477938" watchObservedRunningTime="2026-03-13 12:02:31.600399575 +0000 UTC m=+867.238666338" Mar 13 12:02:32 crc kubenswrapper[4837]: I0313 12:02:32.594942 4837 generic.go:334] "Generic (PLEG): container finished" podID="387739fd-caae-44d0-8cbb-50808d69618b" containerID="29e0acbb4a6feb2029d8a9a6dd8c4183f5cd39fe99116e0bc5bbd9c5fe89c086" exitCode=0 Mar 13 12:02:32 crc kubenswrapper[4837]: I0313 12:02:32.595066 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f8m9m" event={"ID":"387739fd-caae-44d0-8cbb-50808d69618b","Type":"ContainerDied","Data":"29e0acbb4a6feb2029d8a9a6dd8c4183f5cd39fe99116e0bc5bbd9c5fe89c086"} Mar 13 12:02:33 crc kubenswrapper[4837]: I0313 12:02:33.601418 4837 generic.go:334] "Generic (PLEG): container finished" podID="387739fd-caae-44d0-8cbb-50808d69618b" containerID="f924d13940cc44e7e612d34c839c313fd8ac1246f6d95b8a6a73e71f8b63be42" 
exitCode=0 Mar 13 12:02:33 crc kubenswrapper[4837]: I0313 12:02:33.601591 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f8m9m" event={"ID":"387739fd-caae-44d0-8cbb-50808d69618b","Type":"ContainerDied","Data":"f924d13940cc44e7e612d34c839c313fd8ac1246f6d95b8a6a73e71f8b63be42"} Mar 13 12:02:34 crc kubenswrapper[4837]: I0313 12:02:34.612621 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f8m9m" event={"ID":"387739fd-caae-44d0-8cbb-50808d69618b","Type":"ContainerStarted","Data":"853a312f5903550334f26e24dcdd9788b6411dd521712ae569793e38de62f3ac"} Mar 13 12:02:34 crc kubenswrapper[4837]: I0313 12:02:34.612986 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f8m9m" event={"ID":"387739fd-caae-44d0-8cbb-50808d69618b","Type":"ContainerStarted","Data":"853f06af3d709819d2e3489c41e6d0dc6962dd4038a8e0973632ad9645455449"} Mar 13 12:02:34 crc kubenswrapper[4837]: I0313 12:02:34.613000 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f8m9m" event={"ID":"387739fd-caae-44d0-8cbb-50808d69618b","Type":"ContainerStarted","Data":"6e9febbd67d8cfc28862e5ca4062fe2237c13330afe7a73d7ab8fd66b7db3ac1"} Mar 13 12:02:34 crc kubenswrapper[4837]: I0313 12:02:34.613013 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f8m9m" event={"ID":"387739fd-caae-44d0-8cbb-50808d69618b","Type":"ContainerStarted","Data":"6e4b8decb3a41d01d34dddff7572c2acbbf179709004b0067f06773be6e96cad"} Mar 13 12:02:34 crc kubenswrapper[4837]: I0313 12:02:34.613023 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f8m9m" event={"ID":"387739fd-caae-44d0-8cbb-50808d69618b","Type":"ContainerStarted","Data":"4c52a3263ccff805d580c7ba5c486d9728224a126099a3844174a655333fc069"} Mar 13 12:02:35 crc kubenswrapper[4837]: I0313 12:02:35.345549 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/speaker-8skdh" Mar 13 12:02:35 crc kubenswrapper[4837]: I0313 12:02:35.627414 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f8m9m" event={"ID":"387739fd-caae-44d0-8cbb-50808d69618b","Type":"ContainerStarted","Data":"bf887ff1cd304fca174852af0fbe35baab6b293e7db603955d32f541111f8d86"} Mar 13 12:02:35 crc kubenswrapper[4837]: I0313 12:02:35.627586 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:35 crc kubenswrapper[4837]: I0313 12:02:35.659593 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-f8m9m" podStartSLOduration=5.809394477 podStartE2EDuration="12.659569662s" podCreationTimestamp="2026-03-13 12:02:23 +0000 UTC" firstStartedPulling="2026-03-13 12:02:24.44650665 +0000 UTC m=+860.084773413" lastFinishedPulling="2026-03-13 12:02:31.296681835 +0000 UTC m=+866.934948598" observedRunningTime="2026-03-13 12:02:35.65635905 +0000 UTC m=+871.294625883" watchObservedRunningTime="2026-03-13 12:02:35.659569662 +0000 UTC m=+871.297836465" Mar 13 12:02:38 crc kubenswrapper[4837]: I0313 12:02:38.061955 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-twrbr"] Mar 13 12:02:38 crc kubenswrapper[4837]: I0313 12:02:38.063190 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-twrbr" Mar 13 12:02:38 crc kubenswrapper[4837]: I0313 12:02:38.067361 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 13 12:02:38 crc kubenswrapper[4837]: I0313 12:02:38.067435 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 13 12:02:38 crc kubenswrapper[4837]: I0313 12:02:38.075924 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-twrbr"] Mar 13 12:02:38 crc kubenswrapper[4837]: I0313 12:02:38.150274 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmg2d\" (UniqueName: \"kubernetes.io/projected/a7e86b46-33ca-4192-92b4-d01e0a74007f-kube-api-access-zmg2d\") pod \"openstack-operator-index-twrbr\" (UID: \"a7e86b46-33ca-4192-92b4-d01e0a74007f\") " pod="openstack-operators/openstack-operator-index-twrbr" Mar 13 12:02:38 crc kubenswrapper[4837]: I0313 12:02:38.251096 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmg2d\" (UniqueName: \"kubernetes.io/projected/a7e86b46-33ca-4192-92b4-d01e0a74007f-kube-api-access-zmg2d\") pod \"openstack-operator-index-twrbr\" (UID: \"a7e86b46-33ca-4192-92b4-d01e0a74007f\") " pod="openstack-operators/openstack-operator-index-twrbr" Mar 13 12:02:38 crc kubenswrapper[4837]: I0313 12:02:38.272094 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmg2d\" (UniqueName: \"kubernetes.io/projected/a7e86b46-33ca-4192-92b4-d01e0a74007f-kube-api-access-zmg2d\") pod \"openstack-operator-index-twrbr\" (UID: \"a7e86b46-33ca-4192-92b4-d01e0a74007f\") " pod="openstack-operators/openstack-operator-index-twrbr" Mar 13 12:02:38 crc kubenswrapper[4837]: I0313 12:02:38.388750 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-twrbr" Mar 13 12:02:38 crc kubenswrapper[4837]: I0313 12:02:38.812618 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-twrbr"] Mar 13 12:02:39 crc kubenswrapper[4837]: I0313 12:02:39.340302 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:39 crc kubenswrapper[4837]: I0313 12:02:39.377221 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:39 crc kubenswrapper[4837]: I0313 12:02:39.666419 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-twrbr" event={"ID":"a7e86b46-33ca-4192-92b4-d01e0a74007f","Type":"ContainerStarted","Data":"43335a5723133f6d0b760252b36ec6f1a304a0d7335bf4c3d664b6185d08440a"} Mar 13 12:02:40 crc kubenswrapper[4837]: I0313 12:02:40.027425 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-twrbr"] Mar 13 12:02:40 crc kubenswrapper[4837]: I0313 12:02:40.430208 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-mdjzs"] Mar 13 12:02:40 crc kubenswrapper[4837]: I0313 12:02:40.430937 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mdjzs" Mar 13 12:02:40 crc kubenswrapper[4837]: I0313 12:02:40.435141 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-z5kdt" Mar 13 12:02:40 crc kubenswrapper[4837]: I0313 12:02:40.440344 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mdjzs"] Mar 13 12:02:40 crc kubenswrapper[4837]: I0313 12:02:40.583735 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n5cg\" (UniqueName: \"kubernetes.io/projected/9da10ec5-aa1b-4797-91ce-04a91266831a-kube-api-access-6n5cg\") pod \"openstack-operator-index-mdjzs\" (UID: \"9da10ec5-aa1b-4797-91ce-04a91266831a\") " pod="openstack-operators/openstack-operator-index-mdjzs" Mar 13 12:02:40 crc kubenswrapper[4837]: I0313 12:02:40.685835 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n5cg\" (UniqueName: \"kubernetes.io/projected/9da10ec5-aa1b-4797-91ce-04a91266831a-kube-api-access-6n5cg\") pod \"openstack-operator-index-mdjzs\" (UID: \"9da10ec5-aa1b-4797-91ce-04a91266831a\") " pod="openstack-operators/openstack-operator-index-mdjzs" Mar 13 12:02:40 crc kubenswrapper[4837]: I0313 12:02:40.708226 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n5cg\" (UniqueName: \"kubernetes.io/projected/9da10ec5-aa1b-4797-91ce-04a91266831a-kube-api-access-6n5cg\") pod \"openstack-operator-index-mdjzs\" (UID: \"9da10ec5-aa1b-4797-91ce-04a91266831a\") " pod="openstack-operators/openstack-operator-index-mdjzs" Mar 13 12:02:40 crc kubenswrapper[4837]: I0313 12:02:40.758000 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mdjzs" Mar 13 12:02:41 crc kubenswrapper[4837]: I0313 12:02:41.575200 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mdjzs"] Mar 13 12:02:41 crc kubenswrapper[4837]: I0313 12:02:41.684614 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-twrbr" event={"ID":"a7e86b46-33ca-4192-92b4-d01e0a74007f","Type":"ContainerStarted","Data":"274b5fb94f92e54f6dea922d649f43fbf44d8ca2323031bfbfbe1f34754319dc"} Mar 13 12:02:41 crc kubenswrapper[4837]: I0313 12:02:41.684713 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-twrbr" podUID="a7e86b46-33ca-4192-92b4-d01e0a74007f" containerName="registry-server" containerID="cri-o://274b5fb94f92e54f6dea922d649f43fbf44d8ca2323031bfbfbe1f34754319dc" gracePeriod=2 Mar 13 12:02:41 crc kubenswrapper[4837]: I0313 12:02:41.686949 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mdjzs" event={"ID":"9da10ec5-aa1b-4797-91ce-04a91266831a","Type":"ContainerStarted","Data":"f9980e6b3a1501a81063427d9fefa166155c0cd4539721d0622d9171d78b3bf4"} Mar 13 12:02:41 crc kubenswrapper[4837]: I0313 12:02:41.701378 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-twrbr" podStartSLOduration=1.3001593200000001 podStartE2EDuration="3.70135336s" podCreationTimestamp="2026-03-13 12:02:38 +0000 UTC" firstStartedPulling="2026-03-13 12:02:38.825557927 +0000 UTC m=+874.463824710" lastFinishedPulling="2026-03-13 12:02:41.226751987 +0000 UTC m=+876.865018750" observedRunningTime="2026-03-13 12:02:41.69659796 +0000 UTC m=+877.334864723" watchObservedRunningTime="2026-03-13 12:02:41.70135336 +0000 UTC m=+877.339620123" Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.064933 4837 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-twrbr" Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.110201 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmg2d\" (UniqueName: \"kubernetes.io/projected/a7e86b46-33ca-4192-92b4-d01e0a74007f-kube-api-access-zmg2d\") pod \"a7e86b46-33ca-4192-92b4-d01e0a74007f\" (UID: \"a7e86b46-33ca-4192-92b4-d01e0a74007f\") " Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.115284 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7e86b46-33ca-4192-92b4-d01e0a74007f-kube-api-access-zmg2d" (OuterVolumeSpecName: "kube-api-access-zmg2d") pod "a7e86b46-33ca-4192-92b4-d01e0a74007f" (UID: "a7e86b46-33ca-4192-92b4-d01e0a74007f"). InnerVolumeSpecName "kube-api-access-zmg2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.212240 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmg2d\" (UniqueName: \"kubernetes.io/projected/a7e86b46-33ca-4192-92b4-d01e0a74007f-kube-api-access-zmg2d\") on node \"crc\" DevicePath \"\"" Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.694074 4837 generic.go:334] "Generic (PLEG): container finished" podID="a7e86b46-33ca-4192-92b4-d01e0a74007f" containerID="274b5fb94f92e54f6dea922d649f43fbf44d8ca2323031bfbfbe1f34754319dc" exitCode=0 Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.694147 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-twrbr" event={"ID":"a7e86b46-33ca-4192-92b4-d01e0a74007f","Type":"ContainerDied","Data":"274b5fb94f92e54f6dea922d649f43fbf44d8ca2323031bfbfbe1f34754319dc"} Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.694176 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-index-twrbr" event={"ID":"a7e86b46-33ca-4192-92b4-d01e0a74007f","Type":"ContainerDied","Data":"43335a5723133f6d0b760252b36ec6f1a304a0d7335bf4c3d664b6185d08440a"} Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.694196 4837 scope.go:117] "RemoveContainer" containerID="274b5fb94f92e54f6dea922d649f43fbf44d8ca2323031bfbfbe1f34754319dc" Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.694308 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-twrbr" Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.700606 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mdjzs" event={"ID":"9da10ec5-aa1b-4797-91ce-04a91266831a","Type":"ContainerStarted","Data":"ff8978a385436862c2dd9165a5895fd9b48507db65668e8474bde2b10137cecc"} Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.719994 4837 scope.go:117] "RemoveContainer" containerID="274b5fb94f92e54f6dea922d649f43fbf44d8ca2323031bfbfbe1f34754319dc" Mar 13 12:02:42 crc kubenswrapper[4837]: E0313 12:02:42.721881 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"274b5fb94f92e54f6dea922d649f43fbf44d8ca2323031bfbfbe1f34754319dc\": container with ID starting with 274b5fb94f92e54f6dea922d649f43fbf44d8ca2323031bfbfbe1f34754319dc not found: ID does not exist" containerID="274b5fb94f92e54f6dea922d649f43fbf44d8ca2323031bfbfbe1f34754319dc" Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.721927 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"274b5fb94f92e54f6dea922d649f43fbf44d8ca2323031bfbfbe1f34754319dc"} err="failed to get container status \"274b5fb94f92e54f6dea922d649f43fbf44d8ca2323031bfbfbe1f34754319dc\": rpc error: code = NotFound desc = could not find container 
\"274b5fb94f92e54f6dea922d649f43fbf44d8ca2323031bfbfbe1f34754319dc\": container with ID starting with 274b5fb94f92e54f6dea922d649f43fbf44d8ca2323031bfbfbe1f34754319dc not found: ID does not exist" Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.724120 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-mdjzs" podStartSLOduration=2.678480911 podStartE2EDuration="2.724096216s" podCreationTimestamp="2026-03-13 12:02:40 +0000 UTC" firstStartedPulling="2026-03-13 12:02:41.583878639 +0000 UTC m=+877.222145402" lastFinishedPulling="2026-03-13 12:02:41.629493954 +0000 UTC m=+877.267760707" observedRunningTime="2026-03-13 12:02:42.720202913 +0000 UTC m=+878.358469676" watchObservedRunningTime="2026-03-13 12:02:42.724096216 +0000 UTC m=+878.362362999" Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.739009 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-twrbr"] Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.744028 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-twrbr"] Mar 13 12:02:43 crc kubenswrapper[4837]: I0313 12:02:43.057231 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7e86b46-33ca-4192-92b4-d01e0a74007f" path="/var/lib/kubelet/pods/a7e86b46-33ca-4192-92b4-d01e0a74007f/volumes" Mar 13 12:02:43 crc kubenswrapper[4837]: I0313 12:02:43.757815 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7" Mar 13 12:02:43 crc kubenswrapper[4837]: I0313 12:02:43.866575 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-zm9dj" Mar 13 12:02:44 crc kubenswrapper[4837]: I0313 12:02:44.342364 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-f8m9m" Mar 13 
12:02:50 crc kubenswrapper[4837]: I0313 12:02:50.758167 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-mdjzs" Mar 13 12:02:50 crc kubenswrapper[4837]: I0313 12:02:50.758763 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-mdjzs" Mar 13 12:02:50 crc kubenswrapper[4837]: I0313 12:02:50.785889 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-mdjzs" Mar 13 12:02:51 crc kubenswrapper[4837]: I0313 12:02:51.776694 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-mdjzs" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.069899 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b"] Mar 13 12:03:04 crc kubenswrapper[4837]: E0313 12:03:04.071347 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e86b46-33ca-4192-92b4-d01e0a74007f" containerName="registry-server" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.071366 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e86b46-33ca-4192-92b4-d01e0a74007f" containerName="registry-server" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.071536 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7e86b46-33ca-4192-92b4-d01e0a74007f" containerName="registry-server" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.072581 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.078223 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b"] Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.078564 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-kf99l" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.087563 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53ac9dfc-487a-47cf-83f2-91542b93bb95-bundle\") pod \"e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b\" (UID: \"53ac9dfc-487a-47cf-83f2-91542b93bb95\") " pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.087699 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r52m\" (UniqueName: \"kubernetes.io/projected/53ac9dfc-487a-47cf-83f2-91542b93bb95-kube-api-access-8r52m\") pod \"e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b\" (UID: \"53ac9dfc-487a-47cf-83f2-91542b93bb95\") " pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.087728 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53ac9dfc-487a-47cf-83f2-91542b93bb95-util\") pod \"e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b\" (UID: \"53ac9dfc-487a-47cf-83f2-91542b93bb95\") " pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 
12:03:04.188490 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53ac9dfc-487a-47cf-83f2-91542b93bb95-bundle\") pod \"e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b\" (UID: \"53ac9dfc-487a-47cf-83f2-91542b93bb95\") " pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.188624 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r52m\" (UniqueName: \"kubernetes.io/projected/53ac9dfc-487a-47cf-83f2-91542b93bb95-kube-api-access-8r52m\") pod \"e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b\" (UID: \"53ac9dfc-487a-47cf-83f2-91542b93bb95\") " pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.188688 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53ac9dfc-487a-47cf-83f2-91542b93bb95-util\") pod \"e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b\" (UID: \"53ac9dfc-487a-47cf-83f2-91542b93bb95\") " pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.189308 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53ac9dfc-487a-47cf-83f2-91542b93bb95-bundle\") pod \"e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b\" (UID: \"53ac9dfc-487a-47cf-83f2-91542b93bb95\") " pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.189332 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/53ac9dfc-487a-47cf-83f2-91542b93bb95-util\") pod \"e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b\" (UID: \"53ac9dfc-487a-47cf-83f2-91542b93bb95\") " pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.212656 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r52m\" (UniqueName: \"kubernetes.io/projected/53ac9dfc-487a-47cf-83f2-91542b93bb95-kube-api-access-8r52m\") pod \"e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b\" (UID: \"53ac9dfc-487a-47cf-83f2-91542b93bb95\") " pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.391039 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.797836 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b"] Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.835245 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" event={"ID":"53ac9dfc-487a-47cf-83f2-91542b93bb95","Type":"ContainerStarted","Data":"db248d00a2576920070079868eff75c4075eb99329a51ca1c2b9e34722c9b26a"} Mar 13 12:03:05 crc kubenswrapper[4837]: I0313 12:03:05.843760 4837 generic.go:334] "Generic (PLEG): container finished" podID="53ac9dfc-487a-47cf-83f2-91542b93bb95" containerID="a274adcdd19e87ea805b41482270fea6edd2aed6a3b3d5426c7b9c13123d6942" exitCode=0 Mar 13 12:03:05 crc kubenswrapper[4837]: I0313 12:03:05.843875 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" event={"ID":"53ac9dfc-487a-47cf-83f2-91542b93bb95","Type":"ContainerDied","Data":"a274adcdd19e87ea805b41482270fea6edd2aed6a3b3d5426c7b9c13123d6942"} Mar 13 12:03:05 crc kubenswrapper[4837]: I0313 12:03:05.846715 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:03:06 crc kubenswrapper[4837]: I0313 12:03:06.855088 4837 generic.go:334] "Generic (PLEG): container finished" podID="53ac9dfc-487a-47cf-83f2-91542b93bb95" containerID="059c38107b2a0b253e4cfb490b581f58a0086997bd0ae22c09f0f4b699ad0737" exitCode=0 Mar 13 12:03:06 crc kubenswrapper[4837]: I0313 12:03:06.855171 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" event={"ID":"53ac9dfc-487a-47cf-83f2-91542b93bb95","Type":"ContainerDied","Data":"059c38107b2a0b253e4cfb490b581f58a0086997bd0ae22c09f0f4b699ad0737"} Mar 13 12:03:07 crc kubenswrapper[4837]: I0313 12:03:07.865669 4837 generic.go:334] "Generic (PLEG): container finished" podID="53ac9dfc-487a-47cf-83f2-91542b93bb95" containerID="d9cfaabf006620bf7355481966b22d6484289d6fe86e72479ae98c95a57d85b1" exitCode=0 Mar 13 12:03:07 crc kubenswrapper[4837]: I0313 12:03:07.865729 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" event={"ID":"53ac9dfc-487a-47cf-83f2-91542b93bb95","Type":"ContainerDied","Data":"d9cfaabf006620bf7355481966b22d6484289d6fe86e72479ae98c95a57d85b1"} Mar 13 12:03:09 crc kubenswrapper[4837]: I0313 12:03:09.135487 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" Mar 13 12:03:09 crc kubenswrapper[4837]: I0313 12:03:09.250803 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r52m\" (UniqueName: \"kubernetes.io/projected/53ac9dfc-487a-47cf-83f2-91542b93bb95-kube-api-access-8r52m\") pod \"53ac9dfc-487a-47cf-83f2-91542b93bb95\" (UID: \"53ac9dfc-487a-47cf-83f2-91542b93bb95\") " Mar 13 12:03:09 crc kubenswrapper[4837]: I0313 12:03:09.250936 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53ac9dfc-487a-47cf-83f2-91542b93bb95-bundle\") pod \"53ac9dfc-487a-47cf-83f2-91542b93bb95\" (UID: \"53ac9dfc-487a-47cf-83f2-91542b93bb95\") " Mar 13 12:03:09 crc kubenswrapper[4837]: I0313 12:03:09.251041 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53ac9dfc-487a-47cf-83f2-91542b93bb95-util\") pod \"53ac9dfc-487a-47cf-83f2-91542b93bb95\" (UID: \"53ac9dfc-487a-47cf-83f2-91542b93bb95\") " Mar 13 12:03:09 crc kubenswrapper[4837]: I0313 12:03:09.251654 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53ac9dfc-487a-47cf-83f2-91542b93bb95-bundle" (OuterVolumeSpecName: "bundle") pod "53ac9dfc-487a-47cf-83f2-91542b93bb95" (UID: "53ac9dfc-487a-47cf-83f2-91542b93bb95"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:03:09 crc kubenswrapper[4837]: I0313 12:03:09.256222 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53ac9dfc-487a-47cf-83f2-91542b93bb95-kube-api-access-8r52m" (OuterVolumeSpecName: "kube-api-access-8r52m") pod "53ac9dfc-487a-47cf-83f2-91542b93bb95" (UID: "53ac9dfc-487a-47cf-83f2-91542b93bb95"). InnerVolumeSpecName "kube-api-access-8r52m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:03:09 crc kubenswrapper[4837]: I0313 12:03:09.264561 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53ac9dfc-487a-47cf-83f2-91542b93bb95-util" (OuterVolumeSpecName: "util") pod "53ac9dfc-487a-47cf-83f2-91542b93bb95" (UID: "53ac9dfc-487a-47cf-83f2-91542b93bb95"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:03:09 crc kubenswrapper[4837]: I0313 12:03:09.352293 4837 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53ac9dfc-487a-47cf-83f2-91542b93bb95-util\") on node \"crc\" DevicePath \"\"" Mar 13 12:03:09 crc kubenswrapper[4837]: I0313 12:03:09.352578 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r52m\" (UniqueName: \"kubernetes.io/projected/53ac9dfc-487a-47cf-83f2-91542b93bb95-kube-api-access-8r52m\") on node \"crc\" DevicePath \"\"" Mar 13 12:03:09 crc kubenswrapper[4837]: I0313 12:03:09.352773 4837 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53ac9dfc-487a-47cf-83f2-91542b93bb95-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:03:09 crc kubenswrapper[4837]: I0313 12:03:09.886004 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" event={"ID":"53ac9dfc-487a-47cf-83f2-91542b93bb95","Type":"ContainerDied","Data":"db248d00a2576920070079868eff75c4075eb99329a51ca1c2b9e34722c9b26a"} Mar 13 12:03:09 crc kubenswrapper[4837]: I0313 12:03:09.886051 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db248d00a2576920070079868eff75c4075eb99329a51ca1c2b9e34722c9b26a" Mar 13 12:03:09 crc kubenswrapper[4837]: I0313 12:03:09.886069 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.112297 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-c99df78b8-qxmfb"] Mar 13 12:03:16 crc kubenswrapper[4837]: E0313 12:03:16.113089 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ac9dfc-487a-47cf-83f2-91542b93bb95" containerName="extract" Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.113107 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ac9dfc-487a-47cf-83f2-91542b93bb95" containerName="extract" Mar 13 12:03:16 crc kubenswrapper[4837]: E0313 12:03:16.113126 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ac9dfc-487a-47cf-83f2-91542b93bb95" containerName="pull" Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.113133 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ac9dfc-487a-47cf-83f2-91542b93bb95" containerName="pull" Mar 13 12:03:16 crc kubenswrapper[4837]: E0313 12:03:16.113147 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ac9dfc-487a-47cf-83f2-91542b93bb95" containerName="util" Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.113155 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ac9dfc-487a-47cf-83f2-91542b93bb95" containerName="util" Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.113301 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="53ac9dfc-487a-47cf-83f2-91542b93bb95" containerName="extract" Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.113857 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-c99df78b8-qxmfb" Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.116565 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-bvswq" Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.132826 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-c99df78b8-qxmfb"] Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.282027 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62h5n\" (UniqueName: \"kubernetes.io/projected/4f8c5e9e-7680-4bc3-8096-0c62a1de4da5-kube-api-access-62h5n\") pod \"openstack-operator-controller-init-c99df78b8-qxmfb\" (UID: \"4f8c5e9e-7680-4bc3-8096-0c62a1de4da5\") " pod="openstack-operators/openstack-operator-controller-init-c99df78b8-qxmfb" Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.383370 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62h5n\" (UniqueName: \"kubernetes.io/projected/4f8c5e9e-7680-4bc3-8096-0c62a1de4da5-kube-api-access-62h5n\") pod \"openstack-operator-controller-init-c99df78b8-qxmfb\" (UID: \"4f8c5e9e-7680-4bc3-8096-0c62a1de4da5\") " pod="openstack-operators/openstack-operator-controller-init-c99df78b8-qxmfb" Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.408331 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62h5n\" (UniqueName: \"kubernetes.io/projected/4f8c5e9e-7680-4bc3-8096-0c62a1de4da5-kube-api-access-62h5n\") pod \"openstack-operator-controller-init-c99df78b8-qxmfb\" (UID: \"4f8c5e9e-7680-4bc3-8096-0c62a1de4da5\") " pod="openstack-operators/openstack-operator-controller-init-c99df78b8-qxmfb" Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.444068 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-c99df78b8-qxmfb" Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.898154 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-c99df78b8-qxmfb"] Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.935497 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-c99df78b8-qxmfb" event={"ID":"4f8c5e9e-7680-4bc3-8096-0c62a1de4da5","Type":"ContainerStarted","Data":"a91152655fd513fa810a620170966830e4d18b0c4296c0a2036388aef535cced"} Mar 13 12:03:21 crc kubenswrapper[4837]: I0313 12:03:21.973066 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-c99df78b8-qxmfb" event={"ID":"4f8c5e9e-7680-4bc3-8096-0c62a1de4da5","Type":"ContainerStarted","Data":"1b1541ed1a3eef95d859359a0a9ace57b247ce760a7f62e559775c7828759bb7"} Mar 13 12:03:21 crc kubenswrapper[4837]: I0313 12:03:21.973910 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-c99df78b8-qxmfb" Mar 13 12:03:22 crc kubenswrapper[4837]: I0313 12:03:22.021254 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-c99df78b8-qxmfb" podStartSLOduration=1.770435438 podStartE2EDuration="6.021223354s" podCreationTimestamp="2026-03-13 12:03:16 +0000 UTC" firstStartedPulling="2026-03-13 12:03:16.913162993 +0000 UTC m=+912.551429756" lastFinishedPulling="2026-03-13 12:03:21.163950909 +0000 UTC m=+916.802217672" observedRunningTime="2026-03-13 12:03:22.010552667 +0000 UTC m=+917.648819440" watchObservedRunningTime="2026-03-13 12:03:22.021223354 +0000 UTC m=+917.659490117" Mar 13 12:03:26 crc kubenswrapper[4837]: I0313 12:03:26.457773 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-controller-init-c99df78b8-qxmfb" Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.962125 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-jvdqq"] Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.963362 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-jvdqq" Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.965429 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-c69xz" Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.976687 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-jvdqq"] Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.981209 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-kbn8z"] Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.981964 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-kbn8z" Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.985021 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-twrwq" Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.989065 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-b7cdx"] Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.990201 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-b7cdx" Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.991576 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-2lzrn" Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.993806 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-mrgb9"] Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.994659 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mrgb9" Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.996348 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-2qszj" Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.998370 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-kbn8z"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.004119 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-b7cdx"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.009190 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-ss4rm"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.009960 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-ss4rm"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.011687 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-gk8bz"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.027986 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-mrgb9"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.066523 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-ss4rm"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.070834 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dgth\" (UniqueName: \"kubernetes.io/projected/1870e3ae-40fd-479c-9aa7-9ce3a3e2dd2e-kube-api-access-9dgth\") pod \"glance-operator-controller-manager-5964f64c48-mrgb9\" (UID: \"1870e3ae-40fd-479c-9aa7-9ce3a3e2dd2e\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mrgb9"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.070893 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf5qp\" (UniqueName: \"kubernetes.io/projected/0a24601d-8e41-4f99-9e33-870d791a3e7e-kube-api-access-qf5qp\") pod \"cinder-operator-controller-manager-984cd4dcf-kbn8z\" (UID: \"0a24601d-8e41-4f99-9e33-870d791a3e7e\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-kbn8z"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.070920 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvm59\" (UniqueName: \"kubernetes.io/projected/e645f00a-8463-4fac-b010-f0500b54d68a-kube-api-access-jvm59\") pod \"designate-operator-controller-manager-66d56f6ff4-b7cdx\" (UID: \"e645f00a-8463-4fac-b010-f0500b54d68a\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-b7cdx"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.070957 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffhls\" (UniqueName: \"kubernetes.io/projected/1d59bb7f-598d-4c70-9b8c-ce4e3048691f-kube-api-access-ffhls\") pod \"barbican-operator-controller-manager-677bd678f7-jvdqq\" (UID: \"1d59bb7f-598d-4c70-9b8c-ce4e3048691f\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-jvdqq"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.076286 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-bvmr7"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.094899 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-bvmr7"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.103234 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-m6md6"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.133924 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-bvmr7"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.162761 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.163775 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.171246 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.171324 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-dzbzz"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.171859 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mnsd\" (UniqueName: \"kubernetes.io/projected/11a29883-0638-4da4-a1dc-bf2127a3645c-kube-api-access-5mnsd\") pod \"horizon-operator-controller-manager-6d9d6b584d-bvmr7\" (UID: \"11a29883-0638-4da4-a1dc-bf2127a3645c\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-bvmr7"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.171917 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dgth\" (UniqueName: \"kubernetes.io/projected/1870e3ae-40fd-479c-9aa7-9ce3a3e2dd2e-kube-api-access-9dgth\") pod \"glance-operator-controller-manager-5964f64c48-mrgb9\" (UID: \"1870e3ae-40fd-479c-9aa7-9ce3a3e2dd2e\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mrgb9"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.171950 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9jq6\" (UniqueName: \"kubernetes.io/projected/b2c881d7-03db-4608-a3f4-9a9ad8b2f5da-kube-api-access-t9jq6\") pod \"heat-operator-controller-manager-77b6666d85-ss4rm\" (UID: \"b2c881d7-03db-4608-a3f4-9a9ad8b2f5da\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-ss4rm"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.172017 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf5qp\" (UniqueName: \"kubernetes.io/projected/0a24601d-8e41-4f99-9e33-870d791a3e7e-kube-api-access-qf5qp\") pod \"cinder-operator-controller-manager-984cd4dcf-kbn8z\" (UID: \"0a24601d-8e41-4f99-9e33-870d791a3e7e\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-kbn8z"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.172053 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvm59\" (UniqueName: \"kubernetes.io/projected/e645f00a-8463-4fac-b010-f0500b54d68a-kube-api-access-jvm59\") pod \"designate-operator-controller-manager-66d56f6ff4-b7cdx\" (UID: \"e645f00a-8463-4fac-b010-f0500b54d68a\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-b7cdx"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.172108 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffhls\" (UniqueName: \"kubernetes.io/projected/1d59bb7f-598d-4c70-9b8c-ce4e3048691f-kube-api-access-ffhls\") pod \"barbican-operator-controller-manager-677bd678f7-jvdqq\" (UID: \"1d59bb7f-598d-4c70-9b8c-ce4e3048691f\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-jvdqq"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.199378 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffhls\" (UniqueName: \"kubernetes.io/projected/1d59bb7f-598d-4c70-9b8c-ce4e3048691f-kube-api-access-ffhls\") pod \"barbican-operator-controller-manager-677bd678f7-jvdqq\" (UID: \"1d59bb7f-598d-4c70-9b8c-ce4e3048691f\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-jvdqq"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.201583 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-9zvxf"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.201945 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvm59\" (UniqueName: \"kubernetes.io/projected/e645f00a-8463-4fac-b010-f0500b54d68a-kube-api-access-jvm59\") pod \"designate-operator-controller-manager-66d56f6ff4-b7cdx\" (UID: \"e645f00a-8463-4fac-b010-f0500b54d68a\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-b7cdx"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.202698 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-9zvxf"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.205685 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-d57j6"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.207264 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf5qp\" (UniqueName: \"kubernetes.io/projected/0a24601d-8e41-4f99-9e33-870d791a3e7e-kube-api-access-qf5qp\") pod \"cinder-operator-controller-manager-984cd4dcf-kbn8z\" (UID: \"0a24601d-8e41-4f99-9e33-870d791a3e7e\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-kbn8z"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.214552 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dgth\" (UniqueName: \"kubernetes.io/projected/1870e3ae-40fd-479c-9aa7-9ce3a3e2dd2e-kube-api-access-9dgth\") pod \"glance-operator-controller-manager-5964f64c48-mrgb9\" (UID: \"1870e3ae-40fd-479c-9aa7-9ce3a3e2dd2e\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mrgb9"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.219280 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.244509 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-9zvxf"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.264756 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-kc2x6"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.265943 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc2x6"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.268534 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-cwfjs"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.270274 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-twrg7"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.271039 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-twrg7"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.273341 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mnsd\" (UniqueName: \"kubernetes.io/projected/11a29883-0638-4da4-a1dc-bf2127a3645c-kube-api-access-5mnsd\") pod \"horizon-operator-controller-manager-6d9d6b584d-bvmr7\" (UID: \"11a29883-0638-4da4-a1dc-bf2127a3645c\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-bvmr7"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.273457 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfnnw\" (UniqueName: \"kubernetes.io/projected/c19c3466-ab50-4be3-8299-d7b8b3d263df-kube-api-access-jfnnw\") pod \"infra-operator-controller-manager-5995f4446f-fhlk9\" (UID: \"c19c3466-ab50-4be3-8299-d7b8b3d263df\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.273491 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9jq6\" (UniqueName: \"kubernetes.io/projected/b2c881d7-03db-4608-a3f4-9a9ad8b2f5da-kube-api-access-t9jq6\") pod \"heat-operator-controller-manager-77b6666d85-ss4rm\" (UID: \"b2c881d7-03db-4608-a3f4-9a9ad8b2f5da\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-ss4rm"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.273548 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert\") pod \"infra-operator-controller-manager-5995f4446f-fhlk9\" (UID: \"c19c3466-ab50-4be3-8299-d7b8b3d263df\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.273585 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb5zl\" (UniqueName: \"kubernetes.io/projected/89e6d6f8-7bd3-4862-b41c-cd5c1f05f3e5-kube-api-access-xb5zl\") pod \"ironic-operator-controller-manager-6bbb499bbc-9zvxf\" (UID: \"89e6d6f8-7bd3-4862-b41c-cd5c1f05f3e5\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-9zvxf"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.278946 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-674gz"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.283371 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-jvdqq"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.284502 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-kc2x6"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.306677 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-kbn8z"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.307852 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-twrg7"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.307911 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7nm95"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.308937 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7nm95"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.316998 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-b7cdx"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.317208 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-9hk47"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.317588 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mnsd\" (UniqueName: \"kubernetes.io/projected/11a29883-0638-4da4-a1dc-bf2127a3645c-kube-api-access-5mnsd\") pod \"horizon-operator-controller-manager-6d9d6b584d-bvmr7\" (UID: \"11a29883-0638-4da4-a1dc-bf2127a3645c\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-bvmr7"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.329387 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9jq6\" (UniqueName: \"kubernetes.io/projected/b2c881d7-03db-4608-a3f4-9a9ad8b2f5da-kube-api-access-t9jq6\") pod \"heat-operator-controller-manager-77b6666d85-ss4rm\" (UID: \"b2c881d7-03db-4608-a3f4-9a9ad8b2f5da\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-ss4rm"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.334601 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7nm95"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.335888 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mrgb9"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.339714 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556724-st6gn"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.342654 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556724-st6gn"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.345089 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556724-st6gn"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.348576 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.348757 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.348929 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.351401 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-ss4rm"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.362708 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-6ht9l"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.363586 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-6ht9l"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.368842 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-grrds"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.374291 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h2hg\" (UniqueName: \"kubernetes.io/projected/fa1b1ba2-3856-49cb-bda4-8ac5e63b5298-kube-api-access-9h2hg\") pod \"manila-operator-controller-manager-68f45f9d9f-twrg7\" (UID: \"fa1b1ba2-3856-49cb-bda4-8ac5e63b5298\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-twrg7"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.374343 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvmnr\" (UniqueName: \"kubernetes.io/projected/9bd066a9-3999-405a-b619-540678a46ded-kube-api-access-xvmnr\") pod \"keystone-operator-controller-manager-684f77d66d-kc2x6\" (UID: \"9bd066a9-3999-405a-b619-540678a46ded\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc2x6"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.374385 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert\") pod \"infra-operator-controller-manager-5995f4446f-fhlk9\" (UID: \"c19c3466-ab50-4be3-8299-d7b8b3d263df\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.374420 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb5zl\" (UniqueName: \"kubernetes.io/projected/89e6d6f8-7bd3-4862-b41c-cd5c1f05f3e5-kube-api-access-xb5zl\") pod \"ironic-operator-controller-manager-6bbb499bbc-9zvxf\" (UID: \"89e6d6f8-7bd3-4862-b41c-cd5c1f05f3e5\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-9zvxf"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.374457 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvfhj\" (UniqueName: \"kubernetes.io/projected/046bdee0-f0cf-4d17-916b-68d301502473-kube-api-access-kvfhj\") pod \"mariadb-operator-controller-manager-658d4cdd5-7nm95\" (UID: \"046bdee0-f0cf-4d17-916b-68d301502473\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7nm95"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.374520 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfnnw\" (UniqueName: \"kubernetes.io/projected/c19c3466-ab50-4be3-8299-d7b8b3d263df-kube-api-access-jfnnw\") pod \"infra-operator-controller-manager-5995f4446f-fhlk9\" (UID: \"c19c3466-ab50-4be3-8299-d7b8b3d263df\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9"
Mar 13 12:04:00 crc kubenswrapper[4837]: E0313 12:04:00.374968 4837 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 13 12:04:00 crc kubenswrapper[4837]: E0313 12:04:00.375018 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert podName:c19c3466-ab50-4be3-8299-d7b8b3d263df nodeName:}" failed. No retries permitted until 2026-03-13 12:04:00.875001849 +0000 UTC m=+956.513268612 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert") pod "infra-operator-controller-manager-5995f4446f-fhlk9" (UID: "c19c3466-ab50-4be3-8299-d7b8b3d263df") : secret "infra-operator-webhook-server-cert" not found
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.391648 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-6ht9l"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.394615 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb5zl\" (UniqueName: \"kubernetes.io/projected/89e6d6f8-7bd3-4862-b41c-cd5c1f05f3e5-kube-api-access-xb5zl\") pod \"ironic-operator-controller-manager-6bbb499bbc-9zvxf\" (UID: \"89e6d6f8-7bd3-4862-b41c-cd5c1f05f3e5\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-9zvxf"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.395703 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfnnw\" (UniqueName: \"kubernetes.io/projected/c19c3466-ab50-4be3-8299-d7b8b3d263df-kube-api-access-jfnnw\") pod \"infra-operator-controller-manager-5995f4446f-fhlk9\" (UID: \"c19c3466-ab50-4be3-8299-d7b8b3d263df\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.396558 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-shrx7"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.397489 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-shrx7"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.401492 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-shrx7"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.402275 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-k657h"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.408445 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7f7zd"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.412252 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7f7zd"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.418814 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-69bgg"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.420602 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.422312 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.427649 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7f7zd"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.432936 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.435824 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-bvmr7"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.437722 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.438499 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.441074 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.441580 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-qzn2w"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.441763 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-5pgmr"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.448067 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.453435 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.468355 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.468434 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.471442 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.473223 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-k86kd"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.475725 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9b48\" (UniqueName: \"kubernetes.io/projected/ee1c592d-7979-4b75-b8e4-7ccd6d7d6048-kube-api-access-p9b48\") pod \"nova-operator-controller-manager-569cc54c5-shrx7\" (UID: \"ee1c592d-7979-4b75-b8e4-7ccd6d7d6048\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-shrx7"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.475777 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvpdw\" (UniqueName: \"kubernetes.io/projected/561aed86-f289-4dd1-8c53-307ccdc99165-kube-api-access-pvpdw\") pod \"octavia-operator-controller-manager-5f4f55cb5c-7f7zd\" (UID: \"561aed86-f289-4dd1-8c53-307ccdc99165\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7f7zd"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.475839 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c85bh\" (UniqueName: \"kubernetes.io/projected/8bda3181-d107-4de8-b754-e5e67dd8dd9c-kube-api-access-c85bh\") pod \"auto-csr-approver-29556724-st6gn\" (UID: \"8bda3181-d107-4de8-b754-e5e67dd8dd9c\") " pod="openshift-infra/auto-csr-approver-29556724-st6gn"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.475897 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h2hg\" (UniqueName: \"kubernetes.io/projected/fa1b1ba2-3856-49cb-bda4-8ac5e63b5298-kube-api-access-9h2hg\") pod \"manila-operator-controller-manager-68f45f9d9f-twrg7\" (UID: \"fa1b1ba2-3856-49cb-bda4-8ac5e63b5298\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-twrg7"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.475921 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvmnr\" (UniqueName: \"kubernetes.io/projected/9bd066a9-3999-405a-b619-540678a46ded-kube-api-access-xvmnr\") pod \"keystone-operator-controller-manager-684f77d66d-kc2x6\" (UID: \"9bd066a9-3999-405a-b619-540678a46ded\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc2x6"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.475947 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4g74\" (UniqueName: \"kubernetes.io/projected/3059d7c0-2624-4d3e-af0f-de054401f1ec-kube-api-access-j4g74\") pod \"neutron-operator-controller-manager-776c5696bf-6ht9l\" (UID: \"3059d7c0-2624-4d3e-af0f-de054401f1ec\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-6ht9l"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.475973 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b77x9vc\" (UID: \"7b38159c-e030-4734-963d-dfc38d29c75c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.475999 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54dqh\" (UniqueName: \"kubernetes.io/projected/7b38159c-e030-4734-963d-dfc38d29c75c-kube-api-access-54dqh\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b77x9vc\" (UID: \"7b38159c-e030-4734-963d-dfc38d29c75c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.476062 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvfhj\" (UniqueName: \"kubernetes.io/projected/046bdee0-f0cf-4d17-916b-68d301502473-kube-api-access-kvfhj\") pod \"mariadb-operator-controller-manager-658d4cdd5-7nm95\" (UID: \"046bdee0-f0cf-4d17-916b-68d301502473\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7nm95"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.478117 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.485463 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.494183 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-zkdlz"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.500606 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.515575 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvfhj\" (UniqueName: \"kubernetes.io/projected/046bdee0-f0cf-4d17-916b-68d301502473-kube-api-access-kvfhj\") pod \"mariadb-operator-controller-manager-658d4cdd5-7nm95\" (UID: \"046bdee0-f0cf-4d17-916b-68d301502473\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7nm95"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.519177 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.545287 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx"]
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.550258 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-wkl4s"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.554155 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvmnr\" (UniqueName: \"kubernetes.io/projected/9bd066a9-3999-405a-b619-540678a46ded-kube-api-access-xvmnr\") pod \"keystone-operator-controller-manager-684f77d66d-kc2x6\" (UID: \"9bd066a9-3999-405a-b619-540678a46ded\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc2x6"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.574510 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h2hg\" (UniqueName: \"kubernetes.io/projected/fa1b1ba2-3856-49cb-bda4-8ac5e63b5298-kube-api-access-9h2hg\") pod \"manila-operator-controller-manager-68f45f9d9f-twrg7\" (UID: \"fa1b1ba2-3856-49cb-bda4-8ac5e63b5298\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-twrg7"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.581333 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7nhg\" (UniqueName: \"kubernetes.io/projected/55649f1c-678e-4e03-be55-7c4435446199-kube-api-access-g7nhg\") pod \"swift-operator-controller-manager-677c674df7-cfv8z\" (UID: \"55649f1c-678e-4e03-be55-7c4435446199\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.581392 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swds9\" (UniqueName: \"kubernetes.io/projected/cb20db22-bd0e-4897-8ed6-a6a80a91ffff-kube-api-access-swds9\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-8lkmx\" (UID: \"cb20db22-bd0e-4897-8ed6-a6a80a91ffff\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.581415 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4g74\" (UniqueName: \"kubernetes.io/projected/3059d7c0-2624-4d3e-af0f-de054401f1ec-kube-api-access-j4g74\") pod \"neutron-operator-controller-manager-776c5696bf-6ht9l\" (UID: \"3059d7c0-2624-4d3e-af0f-de054401f1ec\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-6ht9l"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.581433 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b77x9vc\" (UID: \"7b38159c-e030-4734-963d-dfc38d29c75c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.581453 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54dqh\" (UniqueName: \"kubernetes.io/projected/7b38159c-e030-4734-963d-dfc38d29c75c-kube-api-access-54dqh\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b77x9vc\" (UID: \"7b38159c-e030-4734-963d-dfc38d29c75c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.581499 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr8mf\" (UniqueName: \"kubernetes.io/projected/35a21ab1-95b5-446a-ae10-d004e5aa2995-kube-api-access-zr8mf\") pod \"placement-operator-controller-manager-574d45c66c-fwblp\" (UID: \"35a21ab1-95b5-446a-ae10-d004e5aa2995\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.581522 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9b48\" (UniqueName: \"kubernetes.io/projected/ee1c592d-7979-4b75-b8e4-7ccd6d7d6048-kube-api-access-p9b48\") pod \"nova-operator-controller-manager-569cc54c5-shrx7\" (UID: \"ee1c592d-7979-4b75-b8e4-7ccd6d7d6048\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-shrx7"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.581544 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvpdw\" (UniqueName: \"kubernetes.io/projected/561aed86-f289-4dd1-8c53-307ccdc99165-kube-api-access-pvpdw\") pod \"octavia-operator-controller-manager-5f4f55cb5c-7f7zd\" (UID: \"561aed86-f289-4dd1-8c53-307ccdc99165\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7f7zd"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.581580 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c85bh\" (UniqueName: \"kubernetes.io/projected/8bda3181-d107-4de8-b754-e5e67dd8dd9c-kube-api-access-c85bh\") pod \"auto-csr-approver-29556724-st6gn\" (UID: \"8bda3181-d107-4de8-b754-e5e67dd8dd9c\") " pod="openshift-infra/auto-csr-approver-29556724-st6gn"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.581603 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6n5d\" (UniqueName: \"kubernetes.io/projected/5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d-kube-api-access-k6n5d\") pod \"ovn-operator-controller-manager-bbc5b68f9-nxwr9\" (UID: \"5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9"
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.589000 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-9zvxf"
Mar 13 12:04:00 crc kubenswrapper[4837]: E0313 12:04:00.589896 4837 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 13 12:04:00 crc kubenswrapper[4837]: E0313 12:04:00.590015 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert podName:7b38159c-e030-4734-963d-dfc38d29c75c nodeName:}" failed. No retries permitted until 2026-03-13 12:04:01.089970358 +0000 UTC m=+956.728237121 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" (UID: "7b38159c-e030-4734-963d-dfc38d29c75c") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.609627 4837 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc2x6" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.633985 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4g74\" (UniqueName: \"kubernetes.io/projected/3059d7c0-2624-4d3e-af0f-de054401f1ec-kube-api-access-j4g74\") pod \"neutron-operator-controller-manager-776c5696bf-6ht9l\" (UID: \"3059d7c0-2624-4d3e-af0f-de054401f1ec\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-6ht9l" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.635245 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c85bh\" (UniqueName: \"kubernetes.io/projected/8bda3181-d107-4de8-b754-e5e67dd8dd9c-kube-api-access-c85bh\") pod \"auto-csr-approver-29556724-st6gn\" (UID: \"8bda3181-d107-4de8-b754-e5e67dd8dd9c\") " pod="openshift-infra/auto-csr-approver-29556724-st6gn" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.643422 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvpdw\" (UniqueName: \"kubernetes.io/projected/561aed86-f289-4dd1-8c53-307ccdc99165-kube-api-access-pvpdw\") pod \"octavia-operator-controller-manager-5f4f55cb5c-7f7zd\" (UID: \"561aed86-f289-4dd1-8c53-307ccdc99165\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7f7zd" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.645576 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.651165 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.652230 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54dqh\" (UniqueName: \"kubernetes.io/projected/7b38159c-e030-4734-963d-dfc38d29c75c-kube-api-access-54dqh\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b77x9vc\" (UID: \"7b38159c-e030-4734-963d-dfc38d29c75c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.654332 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.658311 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-t6g4b" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.667497 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9b48\" (UniqueName: \"kubernetes.io/projected/ee1c592d-7979-4b75-b8e4-7ccd6d7d6048-kube-api-access-p9b48\") pod \"nova-operator-controller-manager-569cc54c5-shrx7\" (UID: \"ee1c592d-7979-4b75-b8e4-7ccd6d7d6048\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-shrx7" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.679256 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.680391 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.691243 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-q48z8" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.700493 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-twrg7" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.702695 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.715851 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6n5d\" (UniqueName: \"kubernetes.io/projected/5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d-kube-api-access-k6n5d\") pod \"ovn-operator-controller-manager-bbc5b68f9-nxwr9\" (UID: \"5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.715900 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7nhg\" (UniqueName: \"kubernetes.io/projected/55649f1c-678e-4e03-be55-7c4435446199-kube-api-access-g7nhg\") pod \"swift-operator-controller-manager-677c674df7-cfv8z\" (UID: \"55649f1c-678e-4e03-be55-7c4435446199\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.715931 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swds9\" (UniqueName: \"kubernetes.io/projected/cb20db22-bd0e-4897-8ed6-a6a80a91ffff-kube-api-access-swds9\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-8lkmx\" (UID: 
\"cb20db22-bd0e-4897-8ed6-a6a80a91ffff\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.715999 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8v8b\" (UniqueName: \"kubernetes.io/projected/fe107e39-b5ec-473d-8851-b57775dadafc-kube-api-access-c8v8b\") pod \"test-operator-controller-manager-5c5cb9c4d7-dk4nr\" (UID: \"fe107e39-b5ec-473d-8851-b57775dadafc\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.716038 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr8mf\" (UniqueName: \"kubernetes.io/projected/35a21ab1-95b5-446a-ae10-d004e5aa2995-kube-api-access-zr8mf\") pod \"placement-operator-controller-manager-574d45c66c-fwblp\" (UID: \"35a21ab1-95b5-446a-ae10-d004e5aa2995\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.721533 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7nm95" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.733391 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-jvdqq"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.740971 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556724-st6gn" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.754052 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr8mf\" (UniqueName: \"kubernetes.io/projected/35a21ab1-95b5-446a-ae10-d004e5aa2995-kube-api-access-zr8mf\") pod \"placement-operator-controller-manager-574d45c66c-fwblp\" (UID: \"35a21ab1-95b5-446a-ae10-d004e5aa2995\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.757042 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7nhg\" (UniqueName: \"kubernetes.io/projected/55649f1c-678e-4e03-be55-7c4435446199-kube-api-access-g7nhg\") pod \"swift-operator-controller-manager-677c674df7-cfv8z\" (UID: \"55649f1c-678e-4e03-be55-7c4435446199\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.764161 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swds9\" (UniqueName: \"kubernetes.io/projected/cb20db22-bd0e-4897-8ed6-a6a80a91ffff-kube-api-access-swds9\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-8lkmx\" (UID: \"cb20db22-bd0e-4897-8ed6-a6a80a91ffff\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.767368 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6n5d\" (UniqueName: \"kubernetes.io/projected/5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d-kube-api-access-k6n5d\") pod \"ovn-operator-controller-manager-bbc5b68f9-nxwr9\" (UID: \"5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.768078 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-6ht9l" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.777195 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.783671 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.783782 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.789508 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-shrx7" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.801709 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.801901 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.802330 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-x6jmh" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.824082 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8v8b\" (UniqueName: \"kubernetes.io/projected/fe107e39-b5ec-473d-8851-b57775dadafc-kube-api-access-c8v8b\") pod \"test-operator-controller-manager-5c5cb9c4d7-dk4nr\" (UID: \"fe107e39-b5ec-473d-8851-b57775dadafc\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 
12:04:00.824254 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-984ct\" (UniqueName: \"kubernetes.io/projected/5ef20b1d-5c03-4993-b635-b031ddcab3bf-kube-api-access-984ct\") pod \"watcher-operator-controller-manager-6dd88c6f67-hrcp9\" (UID: \"5ef20b1d-5c03-4993-b635-b031ddcab3bf\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.831333 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7f7zd" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.839229 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkk4z"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.840302 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkk4z" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.843802 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-6xrg5" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.861068 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkk4z"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.869032 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8v8b\" (UniqueName: \"kubernetes.io/projected/fe107e39-b5ec-473d-8851-b57775dadafc-kube-api-access-c8v8b\") pod \"test-operator-controller-manager-5c5cb9c4d7-dk4nr\" (UID: \"fe107e39-b5ec-473d-8851-b57775dadafc\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.895246 4837 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.906969 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.925428 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7wqr\" (UniqueName: \"kubernetes.io/projected/eaf3fa29-f441-43df-9fbe-409d9d8ad871-kube-api-access-v7wqr\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.925486 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-984ct\" (UniqueName: \"kubernetes.io/projected/5ef20b1d-5c03-4993-b635-b031ddcab3bf-kube-api-access-984ct\") pod \"watcher-operator-controller-manager-6dd88c6f67-hrcp9\" (UID: \"5ef20b1d-5c03-4993-b635-b031ddcab3bf\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.925695 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.925746 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert\") 
pod \"infra-operator-controller-manager-5995f4446f-fhlk9\" (UID: \"c19c3466-ab50-4be3-8299-d7b8b3d263df\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.925783 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:00 crc kubenswrapper[4837]: E0313 12:04:00.925943 4837 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.925946 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dls2b\" (UniqueName: \"kubernetes.io/projected/ce0c89e1-3fc0-473d-875f-461c8b423061-kube-api-access-dls2b\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xkk4z\" (UID: \"ce0c89e1-3fc0-473d-875f-461c8b423061\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkk4z" Mar 13 12:04:00 crc kubenswrapper[4837]: E0313 12:04:00.926002 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert podName:c19c3466-ab50-4be3-8299-d7b8b3d263df nodeName:}" failed. No retries permitted until 2026-03-13 12:04:01.925979842 +0000 UTC m=+957.564246665 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert") pod "infra-operator-controller-manager-5995f4446f-fhlk9" (UID: "c19c3466-ab50-4be3-8299-d7b8b3d263df") : secret "infra-operator-webhook-server-cert" not found Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.939689 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.946462 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-984ct\" (UniqueName: \"kubernetes.io/projected/5ef20b1d-5c03-4993-b635-b031ddcab3bf-kube-api-access-984ct\") pod \"watcher-operator-controller-manager-6dd88c6f67-hrcp9\" (UID: \"5ef20b1d-5c03-4993-b635-b031ddcab3bf\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.975082 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx" Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.027112 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.027436 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:01 crc kubenswrapper[4837]: E0313 12:04:01.027477 4837 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.027492 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dls2b\" (UniqueName: \"kubernetes.io/projected/ce0c89e1-3fc0-473d-875f-461c8b423061-kube-api-access-dls2b\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xkk4z\" (UID: \"ce0c89e1-3fc0-473d-875f-461c8b423061\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkk4z" Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.027514 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7wqr\" (UniqueName: \"kubernetes.io/projected/eaf3fa29-f441-43df-9fbe-409d9d8ad871-kube-api-access-v7wqr\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: 
\"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:01 crc kubenswrapper[4837]: E0313 12:04:01.027530 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs podName:eaf3fa29-f441-43df-9fbe-409d9d8ad871 nodeName:}" failed. No retries permitted until 2026-03-13 12:04:01.527514278 +0000 UTC m=+957.165781041 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs") pod "openstack-operator-controller-manager-55876d85bb-96mp7" (UID: "eaf3fa29-f441-43df-9fbe-409d9d8ad871") : secret "webhook-server-cert" not found Mar 13 12:04:01 crc kubenswrapper[4837]: E0313 12:04:01.027716 4837 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 12:04:01 crc kubenswrapper[4837]: E0313 12:04:01.027754 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs podName:eaf3fa29-f441-43df-9fbe-409d9d8ad871 nodeName:}" failed. No retries permitted until 2026-03-13 12:04:01.527742375 +0000 UTC m=+957.166009138 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs") pod "openstack-operator-controller-manager-55876d85bb-96mp7" (UID: "eaf3fa29-f441-43df-9fbe-409d9d8ad871") : secret "metrics-server-cert" not found Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.049317 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7wqr\" (UniqueName: \"kubernetes.io/projected/eaf3fa29-f441-43df-9fbe-409d9d8ad871-kube-api-access-v7wqr\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.053146 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dls2b\" (UniqueName: \"kubernetes.io/projected/ce0c89e1-3fc0-473d-875f-461c8b423061-kube-api-access-dls2b\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xkk4z\" (UID: \"ce0c89e1-3fc0-473d-875f-461c8b423061\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkk4z" Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.074168 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr" Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.109273 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9" Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.129321 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b77x9vc\" (UID: \"7b38159c-e030-4734-963d-dfc38d29c75c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" Mar 13 12:04:01 crc kubenswrapper[4837]: E0313 12:04:01.129563 4837 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:04:01 crc kubenswrapper[4837]: E0313 12:04:01.129618 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert podName:7b38159c-e030-4734-963d-dfc38d29c75c nodeName:}" failed. No retries permitted until 2026-03-13 12:04:02.129600932 +0000 UTC m=+957.767867695 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" (UID: "7b38159c-e030-4734-963d-dfc38d29c75c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.186512 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkk4z" Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.293050 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-jvdqq" event={"ID":"1d59bb7f-598d-4c70-9b8c-ce4e3048691f","Type":"ContainerStarted","Data":"7d8e7bfd32eadcbd104d30b096182e4677f1eafdf69317cc33edd58d9d0b72f9"} Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.362128 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-kbn8z"] Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.369272 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-mrgb9"] Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.381386 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-b7cdx"] Mar 13 12:04:01 crc kubenswrapper[4837]: W0313 12:04:01.414222 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode645f00a_8463_4fac_b010_f0500b54d68a.slice/crio-c980a6655ad19b2727eca7a82807c2b6b75a428e5c486aad061ecb8b214b4cb8 WatchSource:0}: Error finding container c980a6655ad19b2727eca7a82807c2b6b75a428e5c486aad061ecb8b214b4cb8: Status 404 returned error can't find the container with id c980a6655ad19b2727eca7a82807c2b6b75a428e5c486aad061ecb8b214b4cb8 Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.543841 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " 
pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.544215 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:01 crc kubenswrapper[4837]: E0313 12:04:01.544012 4837 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 12:04:01 crc kubenswrapper[4837]: E0313 12:04:01.544362 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs podName:eaf3fa29-f441-43df-9fbe-409d9d8ad871 nodeName:}" failed. No retries permitted until 2026-03-13 12:04:02.544348349 +0000 UTC m=+958.182615102 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs") pod "openstack-operator-controller-manager-55876d85bb-96mp7" (UID: "eaf3fa29-f441-43df-9fbe-409d9d8ad871") : secret "webhook-server-cert" not found Mar 13 12:04:01 crc kubenswrapper[4837]: E0313 12:04:01.544317 4837 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 12:04:01 crc kubenswrapper[4837]: E0313 12:04:01.544771 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs podName:eaf3fa29-f441-43df-9fbe-409d9d8ad871 nodeName:}" failed. No retries permitted until 2026-03-13 12:04:02.544760062 +0000 UTC m=+958.183026825 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs") pod "openstack-operator-controller-manager-55876d85bb-96mp7" (UID: "eaf3fa29-f441-43df-9fbe-409d9d8ad871") : secret "metrics-server-cert" not found Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.787716 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-ss4rm"] Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.817091 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-kc2x6"] Mar 13 12:04:01 crc kubenswrapper[4837]: W0313 12:04:01.840062 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11a29883_0638_4da4_a1dc_bf2127a3645c.slice/crio-df846e2fca5a361a69d7a486a9114ce8ebd6cfc85586f65cbad53dab6327bc7e WatchSource:0}: Error finding container df846e2fca5a361a69d7a486a9114ce8ebd6cfc85586f65cbad53dab6327bc7e: Status 404 returned error can't find the container with id df846e2fca5a361a69d7a486a9114ce8ebd6cfc85586f65cbad53dab6327bc7e Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.857697 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-9zvxf"] Mar 13 12:04:01 crc kubenswrapper[4837]: W0313 12:04:01.868291 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod046bdee0_f0cf_4d17_916b_68d301502473.slice/crio-973a134af6d2e986eb360f51b4cb55f76448d07d5437dcebe10cfd92ee2b761e WatchSource:0}: Error finding container 973a134af6d2e986eb360f51b4cb55f76448d07d5437dcebe10cfd92ee2b761e: Status 404 returned error can't find the container with id 973a134af6d2e986eb360f51b4cb55f76448d07d5437dcebe10cfd92ee2b761e Mar 13 12:04:01 crc 
kubenswrapper[4837]: I0313 12:04:01.877794 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-bvmr7"] Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.882167 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-twrg7"] Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.886207 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556724-st6gn"] Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.890043 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-shrx7"] Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.896240 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7f7zd"] Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.901841 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7nm95"] Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.949970 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert\") pod \"infra-operator-controller-manager-5995f4446f-fhlk9\" (UID: \"c19c3466-ab50-4be3-8299-d7b8b3d263df\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" Mar 13 12:04:01 crc kubenswrapper[4837]: E0313 12:04:01.950145 4837 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 12:04:01 crc kubenswrapper[4837]: E0313 12:04:01.950249 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert 
podName:c19c3466-ab50-4be3-8299-d7b8b3d263df nodeName:}" failed. No retries permitted until 2026-03-13 12:04:03.950221965 +0000 UTC m=+959.588488798 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert") pod "infra-operator-controller-manager-5995f4446f-fhlk9" (UID: "c19c3466-ab50-4be3-8299-d7b8b3d263df") : secret "infra-operator-webhook-server-cert" not found Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.106463 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p6qtb"] Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.108038 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.123446 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p6qtb"] Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.153260 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b77x9vc\" (UID: \"7b38159c-e030-4734-963d-dfc38d29c75c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.154596 4837 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.154671 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert podName:7b38159c-e030-4734-963d-dfc38d29c75c nodeName:}" failed. 
No retries permitted until 2026-03-13 12:04:04.154653971 +0000 UTC m=+959.792920744 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" (UID: "7b38159c-e030-4734-963d-dfc38d29c75c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.175982 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-6ht9l"] Mar 13 12:04:02 crc kubenswrapper[4837]: W0313 12:04:02.192356 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3059d7c0_2624_4d3e_af0f_de054401f1ec.slice/crio-d2925738557f48a6fb3397d20ead57f90592cd0526a571448f888a7af3946718 WatchSource:0}: Error finding container d2925738557f48a6fb3397d20ead57f90592cd0526a571448f888a7af3946718: Status 404 returned error can't find the container with id d2925738557f48a6fb3397d20ead57f90592cd0526a571448f888a7af3946718 Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.193792 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkk4z"] Mar 13 12:04:02 crc kubenswrapper[4837]: W0313 12:04:02.199354 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce0c89e1_3fc0_473d_875f_461c8b423061.slice/crio-0c846e500d27f6f332ef7b69c630d892f6f55ac4ac50d2ebaf9abf3ba3b5db99 WatchSource:0}: Error finding container 0c846e500d27f6f332ef7b69c630d892f6f55ac4ac50d2ebaf9abf3ba3b5db99: Status 404 returned error can't find the container with id 0c846e500d27f6f332ef7b69c630d892f6f55ac4ac50d2ebaf9abf3ba3b5db99 Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.212601 4837 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx"] Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.218112 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-swds9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6cd66dbd4b-8lkmx_openstack-operators(cb20db22-bd0e-4897-8ed6-a6a80a91ffff): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.219463 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx" podUID="cb20db22-bd0e-4897-8ed6-a6a80a91ffff" Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.229117 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k6n5d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bbc5b68f9-nxwr9_openstack-operators(5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.230699 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9" podUID="5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d" Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.231608 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zr8mf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-574d45c66c-fwblp_openstack-operators(35a21ab1-95b5-446a-ae10-d004e5aa2995): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.231782 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g7nhg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-677c674df7-cfv8z_openstack-operators(55649f1c-678e-4e03-be55-7c4435446199): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.232766 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp" podUID="35a21ab1-95b5-446a-ae10-d004e5aa2995" Mar 13 12:04:02 crc 
kubenswrapper[4837]: E0313 12:04:02.233070 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z" podUID="55649f1c-678e-4e03-be55-7c4435446199" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.233086 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9"] Mar 13 12:04:02 crc kubenswrapper[4837]: W0313 12:04:02.238412 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ef20b1d_5c03_4993_b635_b031ddcab3bf.slice/crio-dcc1d8f44237cca3f04c246ddf270962766cd1585c909eabc90c7c0bbbb9480a WatchSource:0}: Error finding container dcc1d8f44237cca3f04c246ddf270962766cd1585c909eabc90c7c0bbbb9480a: Status 404 returned error can't find the container with id dcc1d8f44237cca3f04c246ddf270962766cd1585c909eabc90c7c0bbbb9480a Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.238603 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z"] Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.242239 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c8v8b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-dk4nr_openstack-operators(fe107e39-b5ec-473d-8851-b57775dadafc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.242455 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-984ct,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6dd88c6f67-hrcp9_openstack-operators(5ef20b1d-5c03-4993-b635-b031ddcab3bf): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.243559 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9" podUID="5ef20b1d-5c03-4993-b635-b031ddcab3bf" Mar 13 12:04:02 crc 
kubenswrapper[4837]: E0313 12:04:02.243559 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr" podUID="fe107e39-b5ec-473d-8851-b57775dadafc" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.244993 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr"] Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.254361 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebbc8197-2f60-4876-8bab-ae450e22db4d-utilities\") pod \"community-operators-p6qtb\" (UID: \"ebbc8197-2f60-4876-8bab-ae450e22db4d\") " pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.254444 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebbc8197-2f60-4876-8bab-ae450e22db4d-catalog-content\") pod \"community-operators-p6qtb\" (UID: \"ebbc8197-2f60-4876-8bab-ae450e22db4d\") " pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.254470 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp64z\" (UniqueName: \"kubernetes.io/projected/ebbc8197-2f60-4876-8bab-ae450e22db4d-kube-api-access-sp64z\") pod \"community-operators-p6qtb\" (UID: \"ebbc8197-2f60-4876-8bab-ae450e22db4d\") " pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.255063 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp"] Mar 13 12:04:02 crc kubenswrapper[4837]: 
I0313 12:04:02.268784 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9"] Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.304137 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7f7zd" event={"ID":"561aed86-f289-4dd1-8c53-307ccdc99165","Type":"ContainerStarted","Data":"7a2ea750319e0d08f2039479d3b5aa46a4b3ffd7adb15c905ea1b11047e76944"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.305358 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp" event={"ID":"35a21ab1-95b5-446a-ae10-d004e5aa2995","Type":"ContainerStarted","Data":"09aea184fa206c9de8e20fb65da85b6e009c3c5562b9a3b0687a0469c5882e23"} Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.307961 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp" podUID="35a21ab1-95b5-446a-ae10-d004e5aa2995" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.311744 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-6ht9l" event={"ID":"3059d7c0-2624-4d3e-af0f-de054401f1ec","Type":"ContainerStarted","Data":"d2925738557f48a6fb3397d20ead57f90592cd0526a571448f888a7af3946718"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.313857 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9" 
event={"ID":"5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d","Type":"ContainerStarted","Data":"cf3854daf3db7c6d695a603af6be0d7698a92c72d53f54446de0440f3118ab70"} Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.315957 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9" podUID="5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.316430 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-kbn8z" event={"ID":"0a24601d-8e41-4f99-9e33-870d791a3e7e","Type":"ContainerStarted","Data":"1ee78c3d4f4d935dc6a2005765aaf1b8b7e393e23a68176976e6f0476c89191e"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.327497 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc2x6" event={"ID":"9bd066a9-3999-405a-b619-540678a46ded","Type":"ContainerStarted","Data":"34ea3ad6afe043b5e68616aa108e8badd29dadcf06d4e6a237885c1435d8fe29"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.332655 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx" event={"ID":"cb20db22-bd0e-4897-8ed6-a6a80a91ffff","Type":"ContainerStarted","Data":"594079f055b4e810263cce9b8b863247718bf8f2b9c52e011f72ec5413e696b1"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.334129 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-twrg7" 
event={"ID":"fa1b1ba2-3856-49cb-bda4-8ac5e63b5298","Type":"ContainerStarted","Data":"c15d9bec887caed298d4c5a009f3c7ef6558cf7a7c93b4119b2424a7a24efd8d"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.335397 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mrgb9" event={"ID":"1870e3ae-40fd-479c-9aa7-9ce3a3e2dd2e","Type":"ContainerStarted","Data":"8ae3541fd730ec7b2211ecae609278323b121bcfd0b7fa803ac1fe83c5cb7824"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.336454 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-9zvxf" event={"ID":"89e6d6f8-7bd3-4862-b41c-cd5c1f05f3e5","Type":"ContainerStarted","Data":"9c912fc0484b61004845556a1bee08639c3644b0b0868268e2c0adaa089e0521"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.338554 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556724-st6gn" event={"ID":"8bda3181-d107-4de8-b754-e5e67dd8dd9c","Type":"ContainerStarted","Data":"c507ea9ace1bdacbbcb4871524163d9cbeadd2ea422f02492bfe1b29dd12a90a"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.340242 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkk4z" event={"ID":"ce0c89e1-3fc0-473d-875f-461c8b423061","Type":"ContainerStarted","Data":"0c846e500d27f6f332ef7b69c630d892f6f55ac4ac50d2ebaf9abf3ba3b5db99"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.341206 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-shrx7" event={"ID":"ee1c592d-7979-4b75-b8e4-7ccd6d7d6048","Type":"ContainerStarted","Data":"899620c325d5224c6c75f31743d65a6033d26885415637cbb61b879f3c5bbec2"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.344165 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-b7cdx" event={"ID":"e645f00a-8463-4fac-b010-f0500b54d68a","Type":"ContainerStarted","Data":"c980a6655ad19b2727eca7a82807c2b6b75a428e5c486aad061ecb8b214b4cb8"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.347378 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7nm95" event={"ID":"046bdee0-f0cf-4d17-916b-68d301502473","Type":"ContainerStarted","Data":"973a134af6d2e986eb360f51b4cb55f76448d07d5437dcebe10cfd92ee2b761e"} Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.347797 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx" podUID="cb20db22-bd0e-4897-8ed6-a6a80a91ffff" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.349252 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9" event={"ID":"5ef20b1d-5c03-4993-b635-b031ddcab3bf","Type":"ContainerStarted","Data":"dcc1d8f44237cca3f04c246ddf270962766cd1585c909eabc90c7c0bbbb9480a"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.351362 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-bvmr7" event={"ID":"11a29883-0638-4da4-a1dc-bf2127a3645c","Type":"ContainerStarted","Data":"df846e2fca5a361a69d7a486a9114ce8ebd6cfc85586f65cbad53dab6327bc7e"} Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.372468 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9" podUID="5ef20b1d-5c03-4993-b635-b031ddcab3bf" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.374612 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebbc8197-2f60-4876-8bab-ae450e22db4d-utilities\") pod \"community-operators-p6qtb\" (UID: \"ebbc8197-2f60-4876-8bab-ae450e22db4d\") " pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.374700 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebbc8197-2f60-4876-8bab-ae450e22db4d-catalog-content\") pod \"community-operators-p6qtb\" (UID: \"ebbc8197-2f60-4876-8bab-ae450e22db4d\") " pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.374724 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp64z\" (UniqueName: \"kubernetes.io/projected/ebbc8197-2f60-4876-8bab-ae450e22db4d-kube-api-access-sp64z\") pod \"community-operators-p6qtb\" (UID: \"ebbc8197-2f60-4876-8bab-ae450e22db4d\") " pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.375172 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebbc8197-2f60-4876-8bab-ae450e22db4d-utilities\") pod \"community-operators-p6qtb\" (UID: \"ebbc8197-2f60-4876-8bab-ae450e22db4d\") " pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.376540 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z" event={"ID":"55649f1c-678e-4e03-be55-7c4435446199","Type":"ContainerStarted","Data":"3aba8663044e53082f5c5ada44ee2551b345d644484002e743b9c11e8ed389b4"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.378745 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebbc8197-2f60-4876-8bab-ae450e22db4d-catalog-content\") pod \"community-operators-p6qtb\" (UID: \"ebbc8197-2f60-4876-8bab-ae450e22db4d\") " pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.383414 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z" podUID="55649f1c-678e-4e03-be55-7c4435446199" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.389370 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-ss4rm" event={"ID":"b2c881d7-03db-4608-a3f4-9a9ad8b2f5da","Type":"ContainerStarted","Data":"4fe0b9552e8c23264df44d7e5a4c8edc527f22f45b2450d1264573009da99ed7"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.396152 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr" event={"ID":"fe107e39-b5ec-473d-8851-b57775dadafc","Type":"ContainerStarted","Data":"d096a2ddd7df99ea480cf5b9762fa3e7f74cdb1bf44c91ac699c2a25bb91a1eb"} Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.398003 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr" podUID="fe107e39-b5ec-473d-8851-b57775dadafc" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.402421 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp64z\" (UniqueName: \"kubernetes.io/projected/ebbc8197-2f60-4876-8bab-ae450e22db4d-kube-api-access-sp64z\") pod \"community-operators-p6qtb\" (UID: \"ebbc8197-2f60-4876-8bab-ae450e22db4d\") " pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.436346 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.580379 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.580444 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.581045 4837 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.581101 4837 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs podName:eaf3fa29-f441-43df-9fbe-409d9d8ad871 nodeName:}" failed. No retries permitted until 2026-03-13 12:04:04.581085349 +0000 UTC m=+960.219352112 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs") pod "openstack-operator-controller-manager-55876d85bb-96mp7" (UID: "eaf3fa29-f441-43df-9fbe-409d9d8ad871") : secret "webhook-server-cert" not found Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.584028 4837 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.584113 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs podName:eaf3fa29-f441-43df-9fbe-409d9d8ad871 nodeName:}" failed. No retries permitted until 2026-03-13 12:04:04.584089614 +0000 UTC m=+960.222356377 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs") pod "openstack-operator-controller-manager-55876d85bb-96mp7" (UID: "eaf3fa29-f441-43df-9fbe-409d9d8ad871") : secret "metrics-server-cert" not found Mar 13 12:04:03 crc kubenswrapper[4837]: I0313 12:04:03.024441 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p6qtb"] Mar 13 12:04:03 crc kubenswrapper[4837]: W0313 12:04:03.052709 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebbc8197_2f60_4876_8bab_ae450e22db4d.slice/crio-9a83d8f4cff27c5d1da1eb8b95de8aad51c44fbb135c4b97c55466208565344c WatchSource:0}: Error finding container 9a83d8f4cff27c5d1da1eb8b95de8aad51c44fbb135c4b97c55466208565344c: Status 404 returned error can't find the container with id 9a83d8f4cff27c5d1da1eb8b95de8aad51c44fbb135c4b97c55466208565344c Mar 13 12:04:03 crc kubenswrapper[4837]: I0313 12:04:03.410212 4837 generic.go:334] "Generic (PLEG): container finished" podID="ebbc8197-2f60-4876-8bab-ae450e22db4d" containerID="0ddcb245a681a303f4445dab08f3327e3df698349bd5573d79829fcc09b9c9ef" exitCode=0 Mar 13 12:04:03 crc kubenswrapper[4837]: I0313 12:04:03.411379 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6qtb" event={"ID":"ebbc8197-2f60-4876-8bab-ae450e22db4d","Type":"ContainerDied","Data":"0ddcb245a681a303f4445dab08f3327e3df698349bd5573d79829fcc09b9c9ef"} Mar 13 12:04:03 crc kubenswrapper[4837]: I0313 12:04:03.411411 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6qtb" event={"ID":"ebbc8197-2f60-4876-8bab-ae450e22db4d","Type":"ContainerStarted","Data":"9a83d8f4cff27c5d1da1eb8b95de8aad51c44fbb135c4b97c55466208565344c"} Mar 13 12:04:03 crc kubenswrapper[4837]: E0313 12:04:03.412583 4837 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9" podUID="5ef20b1d-5c03-4993-b635-b031ddcab3bf" Mar 13 12:04:03 crc kubenswrapper[4837]: E0313 12:04:03.414107 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9" podUID="5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d" Mar 13 12:04:03 crc kubenswrapper[4837]: E0313 12:04:03.414170 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx" podUID="cb20db22-bd0e-4897-8ed6-a6a80a91ffff" Mar 13 12:04:03 crc kubenswrapper[4837]: E0313 12:04:03.414239 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr" podUID="fe107e39-b5ec-473d-8851-b57775dadafc" Mar 13 12:04:03 crc kubenswrapper[4837]: E0313 12:04:03.414543 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp" podUID="35a21ab1-95b5-446a-ae10-d004e5aa2995" Mar 13 12:04:03 crc kubenswrapper[4837]: E0313 12:04:03.417138 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z" podUID="55649f1c-678e-4e03-be55-7c4435446199" Mar 13 12:04:04 crc kubenswrapper[4837]: I0313 12:04:04.011459 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert\") pod \"infra-operator-controller-manager-5995f4446f-fhlk9\" (UID: \"c19c3466-ab50-4be3-8299-d7b8b3d263df\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" Mar 13 12:04:04 crc kubenswrapper[4837]: E0313 12:04:04.013225 4837 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 12:04:04 crc kubenswrapper[4837]: E0313 12:04:04.013436 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert podName:c19c3466-ab50-4be3-8299-d7b8b3d263df nodeName:}" failed. No retries permitted until 2026-03-13 12:04:08.01341905 +0000 UTC m=+963.651685813 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert") pod "infra-operator-controller-manager-5995f4446f-fhlk9" (UID: "c19c3466-ab50-4be3-8299-d7b8b3d263df") : secret "infra-operator-webhook-server-cert" not found Mar 13 12:04:04 crc kubenswrapper[4837]: I0313 12:04:04.217477 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b77x9vc\" (UID: \"7b38159c-e030-4734-963d-dfc38d29c75c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" Mar 13 12:04:04 crc kubenswrapper[4837]: E0313 12:04:04.217663 4837 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:04:04 crc kubenswrapper[4837]: E0313 12:04:04.217852 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert podName:7b38159c-e030-4734-963d-dfc38d29c75c nodeName:}" failed. No retries permitted until 2026-03-13 12:04:08.217830425 +0000 UTC m=+963.856097188 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" (UID: "7b38159c-e030-4734-963d-dfc38d29c75c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:04:04 crc kubenswrapper[4837]: I0313 12:04:04.421114 4837 generic.go:334] "Generic (PLEG): container finished" podID="8bda3181-d107-4de8-b754-e5e67dd8dd9c" containerID="945088ee0e42cd72cf70828366cf9ffb988a0eebcb4e0d5222d7e3f1439eeef4" exitCode=0 Mar 13 12:04:04 crc kubenswrapper[4837]: I0313 12:04:04.421175 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556724-st6gn" event={"ID":"8bda3181-d107-4de8-b754-e5e67dd8dd9c","Type":"ContainerDied","Data":"945088ee0e42cd72cf70828366cf9ffb988a0eebcb4e0d5222d7e3f1439eeef4"} Mar 13 12:04:04 crc kubenswrapper[4837]: I0313 12:04:04.625376 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:04 crc kubenswrapper[4837]: I0313 12:04:04.625457 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:04 crc kubenswrapper[4837]: E0313 12:04:04.625599 4837 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 12:04:04 crc 
kubenswrapper[4837]: E0313 12:04:04.625690 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs podName:eaf3fa29-f441-43df-9fbe-409d9d8ad871 nodeName:}" failed. No retries permitted until 2026-03-13 12:04:08.625670263 +0000 UTC m=+964.263937026 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs") pod "openstack-operator-controller-manager-55876d85bb-96mp7" (UID: "eaf3fa29-f441-43df-9fbe-409d9d8ad871") : secret "webhook-server-cert" not found Mar 13 12:04:04 crc kubenswrapper[4837]: E0313 12:04:04.625731 4837 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 12:04:04 crc kubenswrapper[4837]: E0313 12:04:04.625835 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs podName:eaf3fa29-f441-43df-9fbe-409d9d8ad871 nodeName:}" failed. No retries permitted until 2026-03-13 12:04:08.625816358 +0000 UTC m=+964.264083121 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs") pod "openstack-operator-controller-manager-55876d85bb-96mp7" (UID: "eaf3fa29-f441-43df-9fbe-409d9d8ad871") : secret "metrics-server-cert" not found Mar 13 12:04:05 crc kubenswrapper[4837]: I0313 12:04:05.483446 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:04:05 crc kubenswrapper[4837]: I0313 12:04:05.483881 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:04:08 crc kubenswrapper[4837]: I0313 12:04:08.077606 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert\") pod \"infra-operator-controller-manager-5995f4446f-fhlk9\" (UID: \"c19c3466-ab50-4be3-8299-d7b8b3d263df\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" Mar 13 12:04:08 crc kubenswrapper[4837]: E0313 12:04:08.077786 4837 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 12:04:08 crc kubenswrapper[4837]: E0313 12:04:08.077971 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert podName:c19c3466-ab50-4be3-8299-d7b8b3d263df nodeName:}" failed. 
No retries permitted until 2026-03-13 12:04:16.077953957 +0000 UTC m=+971.716220720 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert") pod "infra-operator-controller-manager-5995f4446f-fhlk9" (UID: "c19c3466-ab50-4be3-8299-d7b8b3d263df") : secret "infra-operator-webhook-server-cert" not found Mar 13 12:04:08 crc kubenswrapper[4837]: I0313 12:04:08.288601 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b77x9vc\" (UID: \"7b38159c-e030-4734-963d-dfc38d29c75c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" Mar 13 12:04:08 crc kubenswrapper[4837]: E0313 12:04:08.288865 4837 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:04:08 crc kubenswrapper[4837]: E0313 12:04:08.288972 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert podName:7b38159c-e030-4734-963d-dfc38d29c75c nodeName:}" failed. No retries permitted until 2026-03-13 12:04:16.2889535 +0000 UTC m=+971.927220263 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" (UID: "7b38159c-e030-4734-963d-dfc38d29c75c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:04:08 crc kubenswrapper[4837]: I0313 12:04:08.693529 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:08 crc kubenswrapper[4837]: I0313 12:04:08.693605 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:08 crc kubenswrapper[4837]: E0313 12:04:08.693740 4837 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 12:04:08 crc kubenswrapper[4837]: E0313 12:04:08.693765 4837 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 12:04:08 crc kubenswrapper[4837]: E0313 12:04:08.693838 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs podName:eaf3fa29-f441-43df-9fbe-409d9d8ad871 nodeName:}" failed. No retries permitted until 2026-03-13 12:04:16.693817515 +0000 UTC m=+972.332084278 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs") pod "openstack-operator-controller-manager-55876d85bb-96mp7" (UID: "eaf3fa29-f441-43df-9fbe-409d9d8ad871") : secret "webhook-server-cert" not found Mar 13 12:04:08 crc kubenswrapper[4837]: E0313 12:04:08.693859 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs podName:eaf3fa29-f441-43df-9fbe-409d9d8ad871 nodeName:}" failed. No retries permitted until 2026-03-13 12:04:16.693850936 +0000 UTC m=+972.332117769 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs") pod "openstack-operator-controller-manager-55876d85bb-96mp7" (UID: "eaf3fa29-f441-43df-9fbe-409d9d8ad871") : secret "metrics-server-cert" not found Mar 13 12:04:09 crc kubenswrapper[4837]: I0313 12:04:09.015511 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-84c45"] Mar 13 12:04:09 crc kubenswrapper[4837]: I0313 12:04:09.020129 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:09 crc kubenswrapper[4837]: I0313 12:04:09.022512 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-84c45"] Mar 13 12:04:09 crc kubenswrapper[4837]: I0313 12:04:09.101669 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-utilities\") pod \"certified-operators-84c45\" (UID: \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\") " pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:09 crc kubenswrapper[4837]: I0313 12:04:09.101745 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kfl9\" (UniqueName: \"kubernetes.io/projected/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-kube-api-access-8kfl9\") pod \"certified-operators-84c45\" (UID: \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\") " pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:09 crc kubenswrapper[4837]: I0313 12:04:09.102003 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-catalog-content\") pod \"certified-operators-84c45\" (UID: \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\") " pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:09 crc kubenswrapper[4837]: I0313 12:04:09.202588 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-catalog-content\") pod \"certified-operators-84c45\" (UID: \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\") " pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:09 crc kubenswrapper[4837]: I0313 12:04:09.202956 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-utilities\") pod \"certified-operators-84c45\" (UID: \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\") " pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:09 crc kubenswrapper[4837]: I0313 12:04:09.202996 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kfl9\" (UniqueName: \"kubernetes.io/projected/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-kube-api-access-8kfl9\") pod \"certified-operators-84c45\" (UID: \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\") " pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:09 crc kubenswrapper[4837]: I0313 12:04:09.203169 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-catalog-content\") pod \"certified-operators-84c45\" (UID: \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\") " pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:09 crc kubenswrapper[4837]: I0313 12:04:09.203683 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-utilities\") pod \"certified-operators-84c45\" (UID: \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\") " pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:09 crc kubenswrapper[4837]: I0313 12:04:09.226670 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kfl9\" (UniqueName: \"kubernetes.io/projected/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-kube-api-access-8kfl9\") pod \"certified-operators-84c45\" (UID: \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\") " pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:09 crc kubenswrapper[4837]: I0313 12:04:09.345368 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:11 crc kubenswrapper[4837]: I0313 12:04:11.600486 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556724-st6gn" Mar 13 12:04:11 crc kubenswrapper[4837]: I0313 12:04:11.743107 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c85bh\" (UniqueName: \"kubernetes.io/projected/8bda3181-d107-4de8-b754-e5e67dd8dd9c-kube-api-access-c85bh\") pod \"8bda3181-d107-4de8-b754-e5e67dd8dd9c\" (UID: \"8bda3181-d107-4de8-b754-e5e67dd8dd9c\") " Mar 13 12:04:11 crc kubenswrapper[4837]: I0313 12:04:11.746796 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bda3181-d107-4de8-b754-e5e67dd8dd9c-kube-api-access-c85bh" (OuterVolumeSpecName: "kube-api-access-c85bh") pod "8bda3181-d107-4de8-b754-e5e67dd8dd9c" (UID: "8bda3181-d107-4de8-b754-e5e67dd8dd9c"). InnerVolumeSpecName "kube-api-access-c85bh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:04:11 crc kubenswrapper[4837]: I0313 12:04:11.844241 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c85bh\" (UniqueName: \"kubernetes.io/projected/8bda3181-d107-4de8-b754-e5e67dd8dd9c-kube-api-access-c85bh\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:12 crc kubenswrapper[4837]: I0313 12:04:12.473620 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556724-st6gn" event={"ID":"8bda3181-d107-4de8-b754-e5e67dd8dd9c","Type":"ContainerDied","Data":"c507ea9ace1bdacbbcb4871524163d9cbeadd2ea422f02492bfe1b29dd12a90a"} Mar 13 12:04:12 crc kubenswrapper[4837]: I0313 12:04:12.473678 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c507ea9ace1bdacbbcb4871524163d9cbeadd2ea422f02492bfe1b29dd12a90a" Mar 13 12:04:12 crc kubenswrapper[4837]: I0313 12:04:12.473727 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556724-st6gn" Mar 13 12:04:12 crc kubenswrapper[4837]: I0313 12:04:12.666989 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556718-7z6qj"] Mar 13 12:04:12 crc kubenswrapper[4837]: I0313 12:04:12.671964 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556718-7z6qj"] Mar 13 12:04:13 crc kubenswrapper[4837]: I0313 12:04:13.056444 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa01e7a4-71d3-4c91-8319-52a575269601" path="/var/lib/kubelet/pods/aa01e7a4-71d3-4c91-8319-52a575269601/volumes" Mar 13 12:04:14 crc kubenswrapper[4837]: I0313 12:04:14.967352 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-84c45"] Mar 13 12:04:14 crc kubenswrapper[4837]: W0313 12:04:14.994756 4837 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b5b628a_9d8f_4ce7_b023_adbb2b00ff47.slice/crio-975dad2eacabc4e891be9a4559895099caf82984d20099b8ca5021f0401addb6 WatchSource:0}: Error finding container 975dad2eacabc4e891be9a4559895099caf82984d20099b8ca5021f0401addb6: Status 404 returned error can't find the container with id 975dad2eacabc4e891be9a4559895099caf82984d20099b8ca5021f0401addb6 Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.501238 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-b7cdx" event={"ID":"e645f00a-8463-4fac-b010-f0500b54d68a","Type":"ContainerStarted","Data":"c0c941079793ea3e2294c1a3ff92e74ae0f005d09a1f62fc3a195290cce0093b"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.502718 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-ss4rm" event={"ID":"b2c881d7-03db-4608-a3f4-9a9ad8b2f5da","Type":"ContainerStarted","Data":"99018576e16e6c14b998e51892e2371fa278a42e6a7c1a766fa800f2545d0556"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.502854 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-ss4rm" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.504101 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-twrg7" event={"ID":"fa1b1ba2-3856-49cb-bda4-8ac5e63b5298","Type":"ContainerStarted","Data":"a981be69ac171a461fb49a754ee45225c0d4814cb8cee38ab244eb7bb64fed80"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.504179 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-twrg7" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.511463 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkk4z" event={"ID":"ce0c89e1-3fc0-473d-875f-461c8b423061","Type":"ContainerStarted","Data":"a9df8093717a2ad9e21bfeae5dc8f64f13d93b7992d61993d6c5fbbf1e5d1a53"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.523550 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc2x6" event={"ID":"9bd066a9-3999-405a-b619-540678a46ded","Type":"ContainerStarted","Data":"7c848b83431cf06b7a899c3a04dc2546822780053219e04b63dec7707f4e69d3"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.523716 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc2x6" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.526196 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7f7zd" event={"ID":"561aed86-f289-4dd1-8c53-307ccdc99165","Type":"ContainerStarted","Data":"d26b09a496068c126b5dd06e2c21171667180c92f655372f30818b492936ef3c"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.526846 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7f7zd" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.530934 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-6ht9l" event={"ID":"3059d7c0-2624-4d3e-af0f-de054401f1ec","Type":"ContainerStarted","Data":"dbda97fc4d7bb724557e5593f08fd816794b762a8cefab05bc3dc86500e3e7f2"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.531035 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-6ht9l" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.538185 4837 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mrgb9" event={"ID":"1870e3ae-40fd-479c-9aa7-9ce3a3e2dd2e","Type":"ContainerStarted","Data":"9f7b0a3f4094dd090d148a3a68dbef241db6f18aca6d8bcdd79501087bfa8e48"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.538471 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mrgb9" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.543909 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-b7cdx" podStartSLOduration=3.229906324 podStartE2EDuration="16.543893626s" podCreationTimestamp="2026-03-13 12:03:59 +0000 UTC" firstStartedPulling="2026-03-13 12:04:01.419009329 +0000 UTC m=+957.057276092" lastFinishedPulling="2026-03-13 12:04:14.732996631 +0000 UTC m=+970.371263394" observedRunningTime="2026-03-13 12:04:15.541666436 +0000 UTC m=+971.179933199" watchObservedRunningTime="2026-03-13 12:04:15.543893626 +0000 UTC m=+971.182160389" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.547163 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84c45" event={"ID":"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47","Type":"ContainerStarted","Data":"d879a84e3b9f99d7a40acbf499a69feddf5c1973d117fb58100dc525ee864d7d"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.547209 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84c45" event={"ID":"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47","Type":"ContainerStarted","Data":"975dad2eacabc4e891be9a4559895099caf82984d20099b8ca5021f0401addb6"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.563312 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-bvmr7" 
event={"ID":"11a29883-0638-4da4-a1dc-bf2127a3645c","Type":"ContainerStarted","Data":"8fe6623e92501c7039280f383c00c64769ab2163095c7634b3a67dbbde9d8baa"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.563955 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-bvmr7" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.580388 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-6ht9l" podStartSLOduration=3.023606036 podStartE2EDuration="15.580365792s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:02.21429473 +0000 UTC m=+957.852561493" lastFinishedPulling="2026-03-13 12:04:14.771054486 +0000 UTC m=+970.409321249" observedRunningTime="2026-03-13 12:04:15.571601944 +0000 UTC m=+971.209868697" watchObservedRunningTime="2026-03-13 12:04:15.580365792 +0000 UTC m=+971.218632565" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.585948 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-9zvxf" event={"ID":"89e6d6f8-7bd3-4862-b41c-cd5c1f05f3e5","Type":"ContainerStarted","Data":"a5b6c5bc653b71ef848edfcf85e66d28959963c70cfcfdf4b0090e115f21466e"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.586803 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-9zvxf" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.606611 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-twrg7" podStartSLOduration=2.759714026 podStartE2EDuration="15.606594122s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:01.843960609 +0000 UTC m=+957.482227372" 
lastFinishedPulling="2026-03-13 12:04:14.690840715 +0000 UTC m=+970.329107468" observedRunningTime="2026-03-13 12:04:15.601030626 +0000 UTC m=+971.239297389" watchObservedRunningTime="2026-03-13 12:04:15.606594122 +0000 UTC m=+971.244860885" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.618053 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7nm95" event={"ID":"046bdee0-f0cf-4d17-916b-68d301502473","Type":"ContainerStarted","Data":"e79795df944d63a92a34cf26e3d6864ae6401490c7cee14a7c7f905c4b2dfdb6"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.618861 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7nm95" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.633768 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-jvdqq" event={"ID":"1d59bb7f-598d-4c70-9b8c-ce4e3048691f","Type":"ContainerStarted","Data":"76bb8a1a95a22170f96278f0e14b759c03de32bf8150a13d7828cdf16602339b"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.634467 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-jvdqq" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.653394 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mrgb9" podStartSLOduration=3.280086905 podStartE2EDuration="16.653362334s" podCreationTimestamp="2026-03-13 12:03:59 +0000 UTC" firstStartedPulling="2026-03-13 12:04:01.402650271 +0000 UTC m=+957.040917034" lastFinishedPulling="2026-03-13 12:04:14.7759257 +0000 UTC m=+970.414192463" observedRunningTime="2026-03-13 12:04:15.649348366 +0000 UTC m=+971.287615129" watchObservedRunningTime="2026-03-13 12:04:15.653362334 +0000 UTC 
m=+971.291629097" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.656045 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6qtb" event={"ID":"ebbc8197-2f60-4876-8bab-ae450e22db4d","Type":"ContainerStarted","Data":"8bba29e2f9fe402d7901b43545abaa3e8580bab8bbce4d30cd8eeaffafb977c7"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.672406 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-shrx7" event={"ID":"ee1c592d-7979-4b75-b8e4-7ccd6d7d6048","Type":"ContainerStarted","Data":"025c8bb60386947143b36cf09cd303596ca8f1daa7b656f7f24c96249980901d"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.673897 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-shrx7" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.699738 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-kbn8z" event={"ID":"0a24601d-8e41-4f99-9e33-870d791a3e7e","Type":"ContainerStarted","Data":"0087edc6acb16aab7cac54b3862d534769dba2c9621d81325a105b830409440e"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.700148 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-kbn8z" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.717597 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc2x6" podStartSLOduration=2.721752415 podStartE2EDuration="15.717565438s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:01.793212383 +0000 UTC m=+957.431479146" lastFinishedPulling="2026-03-13 12:04:14.789025406 +0000 UTC m=+970.427292169" observedRunningTime="2026-03-13 12:04:15.703244954 
+0000 UTC m=+971.341511717" watchObservedRunningTime="2026-03-13 12:04:15.717565438 +0000 UTC m=+971.355832201" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.718737 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-ss4rm" podStartSLOduration=3.775047544 podStartE2EDuration="16.718730274s" podCreationTimestamp="2026-03-13 12:03:59 +0000 UTC" firstStartedPulling="2026-03-13 12:04:01.792546862 +0000 UTC m=+957.430813625" lastFinishedPulling="2026-03-13 12:04:14.736229592 +0000 UTC m=+970.374496355" observedRunningTime="2026-03-13 12:04:15.681152834 +0000 UTC m=+971.319419597" watchObservedRunningTime="2026-03-13 12:04:15.718730274 +0000 UTC m=+971.356997037" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.735094 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7f7zd" podStartSLOduration=3.269176655 podStartE2EDuration="15.735072752s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:01.867164165 +0000 UTC m=+957.505430938" lastFinishedPulling="2026-03-13 12:04:14.333060272 +0000 UTC m=+969.971327035" observedRunningTime="2026-03-13 12:04:15.731087966 +0000 UTC m=+971.369354719" watchObservedRunningTime="2026-03-13 12:04:15.735072752 +0000 UTC m=+971.373339515" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.781975 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkk4z" podStartSLOduration=3.192470765 podStartE2EDuration="15.781951367s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:02.211043477 +0000 UTC m=+957.849310240" lastFinishedPulling="2026-03-13 12:04:14.800524039 +0000 UTC m=+970.438790842" observedRunningTime="2026-03-13 12:04:15.777881918 +0000 UTC m=+971.416148681" 
watchObservedRunningTime="2026-03-13 12:04:15.781951367 +0000 UTC m=+971.420218130" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.826405 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-kbn8z" podStartSLOduration=3.8889076989999998 podStartE2EDuration="16.826383044s" podCreationTimestamp="2026-03-13 12:03:59 +0000 UTC" firstStartedPulling="2026-03-13 12:04:01.40828686 +0000 UTC m=+957.046553623" lastFinishedPulling="2026-03-13 12:04:14.345762205 +0000 UTC m=+969.984028968" observedRunningTime="2026-03-13 12:04:15.82307941 +0000 UTC m=+971.461346173" watchObservedRunningTime="2026-03-13 12:04:15.826383044 +0000 UTC m=+971.464649807" Mar 13 12:04:15 crc kubenswrapper[4837]: E0313 12:04:15.876887 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebbc8197_2f60_4876_8bab_ae450e22db4d.slice/crio-conmon-8bba29e2f9fe402d7901b43545abaa3e8580bab8bbce4d30cd8eeaffafb977c7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebbc8197_2f60_4876_8bab_ae450e22db4d.slice/crio-8bba29e2f9fe402d7901b43545abaa3e8580bab8bbce4d30cd8eeaffafb977c7.scope\": RecentStats: unable to find data in memory cache]" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.896795 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7nm95" podStartSLOduration=3.838338234 podStartE2EDuration="15.896768614s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:01.881774338 +0000 UTC m=+957.520041101" lastFinishedPulling="2026-03-13 12:04:13.940204718 +0000 UTC m=+969.578471481" observedRunningTime="2026-03-13 12:04:15.881594093 +0000 UTC m=+971.519860866" 
watchObservedRunningTime="2026-03-13 12:04:15.896768614 +0000 UTC m=+971.535035387" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.943740 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-shrx7" podStartSLOduration=2.946334107 podStartE2EDuration="15.943721881s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:01.867180495 +0000 UTC m=+957.505447258" lastFinishedPulling="2026-03-13 12:04:14.864568269 +0000 UTC m=+970.502835032" observedRunningTime="2026-03-13 12:04:15.941454359 +0000 UTC m=+971.579721122" watchObservedRunningTime="2026-03-13 12:04:15.943721881 +0000 UTC m=+971.581988644" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.977043 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-bvmr7" podStartSLOduration=3.4862075900000002 podStartE2EDuration="15.977020126s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:01.854971259 +0000 UTC m=+957.493238022" lastFinishedPulling="2026-03-13 12:04:14.345783795 +0000 UTC m=+969.984050558" observedRunningTime="2026-03-13 12:04:15.974196096 +0000 UTC m=+971.612462849" watchObservedRunningTime="2026-03-13 12:04:15.977020126 +0000 UTC m=+971.615286889" Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.039932 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-9zvxf" podStartSLOduration=3.168137805 podStartE2EDuration="16.039909898s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:01.818499214 +0000 UTC m=+957.456765987" lastFinishedPulling="2026-03-13 12:04:14.690271317 +0000 UTC m=+970.328538080" observedRunningTime="2026-03-13 12:04:16.004495106 +0000 UTC m=+971.642761859" 
watchObservedRunningTime="2026-03-13 12:04:16.039909898 +0000 UTC m=+971.678176651" Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.044626 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-jvdqq" podStartSLOduration=3.914331675 podStartE2EDuration="17.044615677s" podCreationTimestamp="2026-03-13 12:03:59 +0000 UTC" firstStartedPulling="2026-03-13 12:04:00.809977828 +0000 UTC m=+956.448244591" lastFinishedPulling="2026-03-13 12:04:13.94026183 +0000 UTC m=+969.578528593" observedRunningTime="2026-03-13 12:04:16.037879113 +0000 UTC m=+971.676145876" watchObservedRunningTime="2026-03-13 12:04:16.044615677 +0000 UTC m=+971.682882440" Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.129740 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert\") pod \"infra-operator-controller-manager-5995f4446f-fhlk9\" (UID: \"c19c3466-ab50-4be3-8299-d7b8b3d263df\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" Mar 13 12:04:16 crc kubenswrapper[4837]: E0313 12:04:16.129924 4837 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 12:04:16 crc kubenswrapper[4837]: E0313 12:04:16.129999 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert podName:c19c3466-ab50-4be3-8299-d7b8b3d263df nodeName:}" failed. No retries permitted until 2026-03-13 12:04:32.129981072 +0000 UTC m=+987.768247825 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert") pod "infra-operator-controller-manager-5995f4446f-fhlk9" (UID: "c19c3466-ab50-4be3-8299-d7b8b3d263df") : secret "infra-operator-webhook-server-cert" not found Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.332033 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b77x9vc\" (UID: \"7b38159c-e030-4734-963d-dfc38d29c75c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" Mar 13 12:04:16 crc kubenswrapper[4837]: E0313 12:04:16.332216 4837 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:04:16 crc kubenswrapper[4837]: E0313 12:04:16.332293 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert podName:7b38159c-e030-4734-963d-dfc38d29c75c nodeName:}" failed. No retries permitted until 2026-03-13 12:04:32.332275929 +0000 UTC m=+987.970542692 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" (UID: "7b38159c-e030-4734-963d-dfc38d29c75c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.711721 4837 generic.go:334] "Generic (PLEG): container finished" podID="ebbc8197-2f60-4876-8bab-ae450e22db4d" containerID="8bba29e2f9fe402d7901b43545abaa3e8580bab8bbce4d30cd8eeaffafb977c7" exitCode=0 Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.711793 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6qtb" event={"ID":"ebbc8197-2f60-4876-8bab-ae450e22db4d","Type":"ContainerDied","Data":"8bba29e2f9fe402d7901b43545abaa3e8580bab8bbce4d30cd8eeaffafb977c7"} Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.713920 4837 generic.go:334] "Generic (PLEG): container finished" podID="5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" containerID="d879a84e3b9f99d7a40acbf499a69feddf5c1973d117fb58100dc525ee864d7d" exitCode=0 Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.714756 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84c45" event={"ID":"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47","Type":"ContainerDied","Data":"d879a84e3b9f99d7a40acbf499a69feddf5c1973d117fb58100dc525ee864d7d"} Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.716429 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-b7cdx" Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.744015 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" 
(UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.744310 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.758427 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.760678 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.779112 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-x6jmh" Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.786257 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:17 crc kubenswrapper[4837]: I0313 12:04:17.113076 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7"] Mar 13 12:04:17 crc kubenswrapper[4837]: W0313 12:04:17.124512 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaf3fa29_f441_43df_9fbe_409d9d8ad871.slice/crio-532e6f022fc3bbb9047521c5b235eeb2ea9c2b1b404f498d41b1b67802ac6a23 WatchSource:0}: Error finding container 532e6f022fc3bbb9047521c5b235eeb2ea9c2b1b404f498d41b1b67802ac6a23: Status 404 returned error can't find the container with id 532e6f022fc3bbb9047521c5b235eeb2ea9c2b1b404f498d41b1b67802ac6a23 Mar 13 12:04:17 crc kubenswrapper[4837]: I0313 12:04:17.720762 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" event={"ID":"eaf3fa29-f441-43df-9fbe-409d9d8ad871","Type":"ContainerStarted","Data":"532e6f022fc3bbb9047521c5b235eeb2ea9c2b1b404f498d41b1b67802ac6a23"} Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.287138 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-jvdqq" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.309091 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-kbn8z" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.320156 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-b7cdx" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.341174 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mrgb9" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.355788 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-ss4rm" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.439709 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-bvmr7" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.593063 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-9zvxf" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.612988 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc2x6" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.703482 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-twrg7" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.724757 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7nm95" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.751053 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" event={"ID":"eaf3fa29-f441-43df-9fbe-409d9d8ad871","Type":"ContainerStarted","Data":"1c79a7f053c259d44cf3c176d2e5da8d9e16078db1e03ad08933021339b6e95c"} Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.751451 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.771300 
4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-6ht9l" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.787980 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" podStartSLOduration=20.787954556 podStartE2EDuration="20.787954556s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:04:20.78146066 +0000 UTC m=+976.419727423" watchObservedRunningTime="2026-03-13 12:04:20.787954556 +0000 UTC m=+976.426221389" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.801223 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-shrx7" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.838198 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7f7zd" Mar 13 12:04:21 crc kubenswrapper[4837]: I0313 12:04:21.951906 4837 scope.go:117] "RemoveContainer" containerID="f165f764ee51b6b29672c3c9a0ac54376301b2d6f3ce983abfa09b63813909b9" Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.788236 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84c45" event={"ID":"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47","Type":"ContainerStarted","Data":"400082a25f5ff89be657587e2ae6b6c9469f186282f23d0b831a5031bd2d1dfb"} Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.790527 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6qtb" 
event={"ID":"ebbc8197-2f60-4876-8bab-ae450e22db4d","Type":"ContainerStarted","Data":"ae0fc7875b5423aa4eeca51206519575c5e9ed80070b4ef32750000a14fbd0df"} Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.792014 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp" event={"ID":"35a21ab1-95b5-446a-ae10-d004e5aa2995","Type":"ContainerStarted","Data":"76522bc68aa812f69238e3872a3412d8bddf1196af88f9e904e2ed539ac6e32a"} Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.792220 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp" Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.793264 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9" event={"ID":"5ef20b1d-5c03-4993-b635-b031ddcab3bf","Type":"ContainerStarted","Data":"e52f5bccfc9477dc64bf778fb2266d7bff7a50af8c2d85588e0a62e951ce03ac"} Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.793419 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9" Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.794576 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9" event={"ID":"5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d","Type":"ContainerStarted","Data":"9f58e7a6ae2a49785e328ced3d15e0ed6a3770c9bd85ea7ee06f1d062a36a7d5"} Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.794798 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9" Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.795959 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z" event={"ID":"55649f1c-678e-4e03-be55-7c4435446199","Type":"ContainerStarted","Data":"057230b019cd1eb295878f6b1772bae9b97d1696a39934a128cc60b0b7f78404"}
Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.796167 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z"
Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.797219 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx" event={"ID":"cb20db22-bd0e-4897-8ed6-a6a80a91ffff","Type":"ContainerStarted","Data":"abd2c01a67167137e03c652ba82b4e7c80e5d272da396063ea4aee9b01b3b72a"}
Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.797407 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx"
Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.798411 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr" event={"ID":"fe107e39-b5ec-473d-8851-b57775dadafc","Type":"ContainerStarted","Data":"9fad83d12a16d9e6168924f862324dea13d9e775d616db279f26dc4bd7d2e686"}
Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.798574 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr"
Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.825973 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr" podStartSLOduration=2.696709791 podStartE2EDuration="24.825958062s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:02.242125402 +0000 UTC m=+957.880392165" lastFinishedPulling="2026-03-13 12:04:24.371373673 +0000 UTC m=+980.009640436" observedRunningTime="2026-03-13 12:04:24.824706793 +0000 UTC m=+980.462973566" watchObservedRunningTime="2026-03-13 12:04:24.825958062 +0000 UTC m=+980.464224825"
Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.850450 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx" podStartSLOduration=4.150153501 podStartE2EDuration="24.850433918s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:02.217956547 +0000 UTC m=+957.856223310" lastFinishedPulling="2026-03-13 12:04:22.918236954 +0000 UTC m=+978.556503727" observedRunningTime="2026-03-13 12:04:24.845366157 +0000 UTC m=+980.483632920" watchObservedRunningTime="2026-03-13 12:04:24.850433918 +0000 UTC m=+980.488700671"
Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.867078 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z" podStartSLOduration=2.769493108 podStartE2EDuration="24.867062775s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:02.231692422 +0000 UTC m=+957.869959185" lastFinishedPulling="2026-03-13 12:04:24.329262079 +0000 UTC m=+979.967528852" observedRunningTime="2026-03-13 12:04:24.864164543 +0000 UTC m=+980.502431306" watchObservedRunningTime="2026-03-13 12:04:24.867062775 +0000 UTC m=+980.505329538"
Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.887209 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9" podStartSLOduration=2.800421825 podStartE2EDuration="24.887190201s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:02.24239122 +0000 UTC m=+957.880657983" lastFinishedPulling="2026-03-13 12:04:24.329159596 +0000 UTC m=+979.967426359" observedRunningTime="2026-03-13 12:04:24.886845751 +0000 UTC m=+980.525112514" watchObservedRunningTime="2026-03-13 12:04:24.887190201 +0000 UTC m=+980.525456964"
Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.911464 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp" podStartSLOduration=2.810509696 podStartE2EDuration="24.91142769s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:02.231483855 +0000 UTC m=+957.869750618" lastFinishedPulling="2026-03-13 12:04:24.332401859 +0000 UTC m=+979.970668612" observedRunningTime="2026-03-13 12:04:24.909128607 +0000 UTC m=+980.547395380" watchObservedRunningTime="2026-03-13 12:04:24.91142769 +0000 UTC m=+980.549694453"
Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.977425 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9" podStartSLOduration=3.401998082 podStartE2EDuration="24.977404219s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:02.228936744 +0000 UTC m=+957.867203507" lastFinishedPulling="2026-03-13 12:04:23.804342881 +0000 UTC m=+979.442609644" observedRunningTime="2026-03-13 12:04:24.943128703 +0000 UTC m=+980.581395466" watchObservedRunningTime="2026-03-13 12:04:24.977404219 +0000 UTC m=+980.615670982"
Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.979012 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p6qtb" podStartSLOduration=3.5822019210000002 podStartE2EDuration="22.97899888s" podCreationTimestamp="2026-03-13 12:04:02 +0000 UTC" firstStartedPulling="2026-03-13 12:04:04.924739956 +0000 UTC m=+960.563006719" lastFinishedPulling="2026-03-13 12:04:24.321536915 +0000 UTC m=+979.959803678" observedRunningTime="2026-03-13 12:04:24.976493451 +0000 UTC m=+980.614760234" watchObservedRunningTime="2026-03-13 12:04:24.97899888 +0000 UTC m=+980.617265643"
Mar 13 12:04:25 crc kubenswrapper[4837]: I0313 12:04:25.807152 4837 generic.go:334] "Generic (PLEG): container finished" podID="5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" containerID="400082a25f5ff89be657587e2ae6b6c9469f186282f23d0b831a5031bd2d1dfb" exitCode=0
Mar 13 12:04:25 crc kubenswrapper[4837]: I0313 12:04:25.807855 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84c45" event={"ID":"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47","Type":"ContainerDied","Data":"400082a25f5ff89be657587e2ae6b6c9469f186282f23d0b831a5031bd2d1dfb"}
Mar 13 12:04:26 crc kubenswrapper[4837]: I0313 12:04:26.793257 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7"
Mar 13 12:04:26 crc kubenswrapper[4837]: I0313 12:04:26.818535 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84c45" event={"ID":"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47","Type":"ContainerStarted","Data":"e4431103593ed4766f2c10d7f2c36127f9debf5d279d8749159e768de7bb7cc2"}
Mar 13 12:04:26 crc kubenswrapper[4837]: I0313 12:04:26.844483 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-84c45" podStartSLOduration=8.183218857 podStartE2EDuration="18.844459049s" podCreationTimestamp="2026-03-13 12:04:08 +0000 UTC" firstStartedPulling="2026-03-13 12:04:15.549173884 +0000 UTC m=+971.187440647" lastFinishedPulling="2026-03-13 12:04:26.210414076 +0000 UTC m=+981.848680839" observedRunningTime="2026-03-13 12:04:26.838278024 +0000 UTC m=+982.476544787" watchObservedRunningTime="2026-03-13 12:04:26.844459049 +0000 UTC m=+982.482725812"
Mar 13 12:04:29 crc kubenswrapper[4837]: I0313 12:04:29.346098 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-84c45"
Mar 13 12:04:29 crc kubenswrapper[4837]: I0313 12:04:29.346160 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-84c45"
Mar 13 12:04:29 crc kubenswrapper[4837]: I0313 12:04:29.394859 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-84c45"
Mar 13 12:04:30 crc kubenswrapper[4837]: I0313 12:04:30.898538 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9"
Mar 13 12:04:30 crc kubenswrapper[4837]: I0313 12:04:30.910369 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp"
Mar 13 12:04:30 crc kubenswrapper[4837]: I0313 12:04:30.944071 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z"
Mar 13 12:04:30 crc kubenswrapper[4837]: I0313 12:04:30.978252 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx"
Mar 13 12:04:31 crc kubenswrapper[4837]: I0313 12:04:31.076988 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr"
Mar 13 12:04:31 crc kubenswrapper[4837]: I0313 12:04:31.114410 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9"
Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.180008 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert\") pod \"infra-operator-controller-manager-5995f4446f-fhlk9\" (UID: \"c19c3466-ab50-4be3-8299-d7b8b3d263df\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9"
Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.187317 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert\") pod \"infra-operator-controller-manager-5995f4446f-fhlk9\" (UID: \"c19c3466-ab50-4be3-8299-d7b8b3d263df\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9"
Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.360566 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-dzbzz"
Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.369160 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9"
Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.383474 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b77x9vc\" (UID: \"7b38159c-e030-4734-963d-dfc38d29c75c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc"
Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.387927 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b77x9vc\" (UID: \"7b38159c-e030-4734-963d-dfc38d29c75c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc"
Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.436708 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p6qtb"
Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.436766 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p6qtb"
Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.515994 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p6qtb"
Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.641480 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-qzn2w"
Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.650335 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc"
Mar 13 12:04:32 crc kubenswrapper[4837]: W0313 12:04:32.829463 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc19c3466_ab50_4be3_8299_d7b8b3d263df.slice/crio-ad6d1202460b82cfc1e3d7bfcdb8137ca76d8f4e67f2ebf0c7488eb495a97a5a WatchSource:0}: Error finding container ad6d1202460b82cfc1e3d7bfcdb8137ca76d8f4e67f2ebf0c7488eb495a97a5a: Status 404 returned error can't find the container with id ad6d1202460b82cfc1e3d7bfcdb8137ca76d8f4e67f2ebf0c7488eb495a97a5a
Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.830745 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9"]
Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.868132 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc"]
Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.868438 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" event={"ID":"c19c3466-ab50-4be3-8299-d7b8b3d263df","Type":"ContainerStarted","Data":"ad6d1202460b82cfc1e3d7bfcdb8137ca76d8f4e67f2ebf0c7488eb495a97a5a"}
Mar 13 12:04:32 crc kubenswrapper[4837]: W0313 12:04:32.874807 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b38159c_e030_4734_963d_dfc38d29c75c.slice/crio-6257d4c927203d267ef19e259a32affb128d05fca949bd662f5e1462b6e46a77 WatchSource:0}: Error finding container 6257d4c927203d267ef19e259a32affb128d05fca949bd662f5e1462b6e46a77: Status 404 returned error can't find the container with id 6257d4c927203d267ef19e259a32affb128d05fca949bd662f5e1462b6e46a77
Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.905480 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p6qtb"
Mar 13 12:04:33 crc kubenswrapper[4837]: I0313 12:04:33.316085 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p6qtb"]
Mar 13 12:04:33 crc kubenswrapper[4837]: I0313 12:04:33.877380 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" event={"ID":"7b38159c-e030-4734-963d-dfc38d29c75c","Type":"ContainerStarted","Data":"6257d4c927203d267ef19e259a32affb128d05fca949bd662f5e1462b6e46a77"}
Mar 13 12:04:34 crc kubenswrapper[4837]: I0313 12:04:34.883473 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p6qtb" podUID="ebbc8197-2f60-4876-8bab-ae450e22db4d" containerName="registry-server" containerID="cri-o://ae0fc7875b5423aa4eeca51206519575c5e9ed80070b4ef32750000a14fbd0df" gracePeriod=2
Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.483372 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.483594 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.720491 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-95c7q"]
Mar 13 12:04:35 crc kubenswrapper[4837]: E0313 12:04:35.721344 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bda3181-d107-4de8-b754-e5e67dd8dd9c" containerName="oc"
Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.721371 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bda3181-d107-4de8-b754-e5e67dd8dd9c" containerName="oc"
Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.721538 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bda3181-d107-4de8-b754-e5e67dd8dd9c" containerName="oc"
Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.722700 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-95c7q"
Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.727344 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-95c7q"]
Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.739188 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecbc61c9-ab9c-485e-9112-bb8704851de8-catalog-content\") pod \"redhat-operators-95c7q\" (UID: \"ecbc61c9-ab9c-485e-9112-bb8704851de8\") " pod="openshift-marketplace/redhat-operators-95c7q"
Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.739253 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecbc61c9-ab9c-485e-9112-bb8704851de8-utilities\") pod \"redhat-operators-95c7q\" (UID: \"ecbc61c9-ab9c-485e-9112-bb8704851de8\") " pod="openshift-marketplace/redhat-operators-95c7q"
Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.739283 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvb2c\" (UniqueName: \"kubernetes.io/projected/ecbc61c9-ab9c-485e-9112-bb8704851de8-kube-api-access-vvb2c\") pod \"redhat-operators-95c7q\" (UID: \"ecbc61c9-ab9c-485e-9112-bb8704851de8\") " pod="openshift-marketplace/redhat-operators-95c7q"
Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.840356 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecbc61c9-ab9c-485e-9112-bb8704851de8-catalog-content\") pod \"redhat-operators-95c7q\" (UID: \"ecbc61c9-ab9c-485e-9112-bb8704851de8\") " pod="openshift-marketplace/redhat-operators-95c7q"
Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.840400 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecbc61c9-ab9c-485e-9112-bb8704851de8-utilities\") pod \"redhat-operators-95c7q\" (UID: \"ecbc61c9-ab9c-485e-9112-bb8704851de8\") " pod="openshift-marketplace/redhat-operators-95c7q"
Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.840421 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvb2c\" (UniqueName: \"kubernetes.io/projected/ecbc61c9-ab9c-485e-9112-bb8704851de8-kube-api-access-vvb2c\") pod \"redhat-operators-95c7q\" (UID: \"ecbc61c9-ab9c-485e-9112-bb8704851de8\") " pod="openshift-marketplace/redhat-operators-95c7q"
Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.841148 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecbc61c9-ab9c-485e-9112-bb8704851de8-catalog-content\") pod \"redhat-operators-95c7q\" (UID: \"ecbc61c9-ab9c-485e-9112-bb8704851de8\") " pod="openshift-marketplace/redhat-operators-95c7q"
Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.841246 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecbc61c9-ab9c-485e-9112-bb8704851de8-utilities\") pod \"redhat-operators-95c7q\" (UID: \"ecbc61c9-ab9c-485e-9112-bb8704851de8\") " pod="openshift-marketplace/redhat-operators-95c7q"
Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.864483 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvb2c\" (UniqueName: \"kubernetes.io/projected/ecbc61c9-ab9c-485e-9112-bb8704851de8-kube-api-access-vvb2c\") pod \"redhat-operators-95c7q\" (UID: \"ecbc61c9-ab9c-485e-9112-bb8704851de8\") " pod="openshift-marketplace/redhat-operators-95c7q"
Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.896308 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" event={"ID":"7b38159c-e030-4734-963d-dfc38d29c75c","Type":"ContainerStarted","Data":"2ab86d7ebd8d40fa48f3ce7fd722f9b522ea7d214ba983eb06850bd968519f0d"}
Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.897314 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc"
Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.907189 4837 generic.go:334] "Generic (PLEG): container finished" podID="ebbc8197-2f60-4876-8bab-ae450e22db4d" containerID="ae0fc7875b5423aa4eeca51206519575c5e9ed80070b4ef32750000a14fbd0df" exitCode=0
Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.907251 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6qtb" event={"ID":"ebbc8197-2f60-4876-8bab-ae450e22db4d","Type":"ContainerDied","Data":"ae0fc7875b5423aa4eeca51206519575c5e9ed80070b4ef32750000a14fbd0df"}
Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.908857 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" event={"ID":"c19c3466-ab50-4be3-8299-d7b8b3d263df","Type":"ContainerStarted","Data":"9a5e7c35318040a70b980cd86ad93a24d7c7c060feeff3e0ab9006cae0e2922d"}
Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.909450 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9"
Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.940524 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" podStartSLOduration=33.644429964 podStartE2EDuration="35.940503404s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:32.876807909 +0000 UTC m=+988.515074672" lastFinishedPulling="2026-03-13 12:04:35.172881349 +0000 UTC m=+990.811148112" observedRunningTime="2026-03-13 12:04:35.935279318 +0000 UTC m=+991.573546081" watchObservedRunningTime="2026-03-13 12:04:35.940503404 +0000 UTC m=+991.578770167"
Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.953427 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p6qtb"
Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.965931 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" podStartSLOduration=33.62729279 podStartE2EDuration="35.965557567s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:32.839156166 +0000 UTC m=+988.477422969" lastFinishedPulling="2026-03-13 12:04:35.177420983 +0000 UTC m=+990.815687746" observedRunningTime="2026-03-13 12:04:35.955896552 +0000 UTC m=+991.594163315" watchObservedRunningTime="2026-03-13 12:04:35.965557567 +0000 UTC m=+991.603824340"
Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.038985 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-95c7q"
Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.042526 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp64z\" (UniqueName: \"kubernetes.io/projected/ebbc8197-2f60-4876-8bab-ae450e22db4d-kube-api-access-sp64z\") pod \"ebbc8197-2f60-4876-8bab-ae450e22db4d\" (UID: \"ebbc8197-2f60-4876-8bab-ae450e22db4d\") "
Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.042613 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebbc8197-2f60-4876-8bab-ae450e22db4d-catalog-content\") pod \"ebbc8197-2f60-4876-8bab-ae450e22db4d\" (UID: \"ebbc8197-2f60-4876-8bab-ae450e22db4d\") "
Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.042681 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebbc8197-2f60-4876-8bab-ae450e22db4d-utilities\") pod \"ebbc8197-2f60-4876-8bab-ae450e22db4d\" (UID: \"ebbc8197-2f60-4876-8bab-ae450e22db4d\") "
Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.043968 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebbc8197-2f60-4876-8bab-ae450e22db4d-utilities" (OuterVolumeSpecName: "utilities") pod "ebbc8197-2f60-4876-8bab-ae450e22db4d" (UID: "ebbc8197-2f60-4876-8bab-ae450e22db4d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.049228 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebbc8197-2f60-4876-8bab-ae450e22db4d-kube-api-access-sp64z" (OuterVolumeSpecName: "kube-api-access-sp64z") pod "ebbc8197-2f60-4876-8bab-ae450e22db4d" (UID: "ebbc8197-2f60-4876-8bab-ae450e22db4d"). InnerVolumeSpecName "kube-api-access-sp64z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.109137 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebbc8197-2f60-4876-8bab-ae450e22db4d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebbc8197-2f60-4876-8bab-ae450e22db4d" (UID: "ebbc8197-2f60-4876-8bab-ae450e22db4d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.143716 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp64z\" (UniqueName: \"kubernetes.io/projected/ebbc8197-2f60-4876-8bab-ae450e22db4d-kube-api-access-sp64z\") on node \"crc\" DevicePath \"\""
Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.143746 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebbc8197-2f60-4876-8bab-ae450e22db4d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.143757 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebbc8197-2f60-4876-8bab-ae450e22db4d-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.505954 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-95c7q"]
Mar 13 12:04:36 crc kubenswrapper[4837]: W0313 12:04:36.509369 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecbc61c9_ab9c_485e_9112_bb8704851de8.slice/crio-d87a304801c0e9f7051845724164f67c6d9d6a0ecd1e18c769cc77332da22942 WatchSource:0}: Error finding container d87a304801c0e9f7051845724164f67c6d9d6a0ecd1e18c769cc77332da22942: Status 404 returned error can't find the container with id d87a304801c0e9f7051845724164f67c6d9d6a0ecd1e18c769cc77332da22942
Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.916499 4837 generic.go:334] "Generic (PLEG): container finished" podID="ecbc61c9-ab9c-485e-9112-bb8704851de8" containerID="80e7a9f84e3e53924280e958be2d8ebd2909130bea780f65b73b56b9eeb2548c" exitCode=0
Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.916553 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95c7q" event={"ID":"ecbc61c9-ab9c-485e-9112-bb8704851de8","Type":"ContainerDied","Data":"80e7a9f84e3e53924280e958be2d8ebd2909130bea780f65b73b56b9eeb2548c"}
Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.916609 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95c7q" event={"ID":"ecbc61c9-ab9c-485e-9112-bb8704851de8","Type":"ContainerStarted","Data":"d87a304801c0e9f7051845724164f67c6d9d6a0ecd1e18c769cc77332da22942"}
Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.919619 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6qtb" event={"ID":"ebbc8197-2f60-4876-8bab-ae450e22db4d","Type":"ContainerDied","Data":"9a83d8f4cff27c5d1da1eb8b95de8aad51c44fbb135c4b97c55466208565344c"}
Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.919673 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p6qtb"
Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.919702 4837 scope.go:117] "RemoveContainer" containerID="ae0fc7875b5423aa4eeca51206519575c5e9ed80070b4ef32750000a14fbd0df"
Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.961173 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p6qtb"]
Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.973492 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p6qtb"]
Mar 13 12:04:37 crc kubenswrapper[4837]: I0313 12:04:37.058519 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebbc8197-2f60-4876-8bab-ae450e22db4d" path="/var/lib/kubelet/pods/ebbc8197-2f60-4876-8bab-ae450e22db4d/volumes"
Mar 13 12:04:37 crc kubenswrapper[4837]: I0313 12:04:37.663893 4837 scope.go:117] "RemoveContainer" containerID="8bba29e2f9fe402d7901b43545abaa3e8580bab8bbce4d30cd8eeaffafb977c7"
Mar 13 12:04:37 crc kubenswrapper[4837]: I0313 12:04:37.686547 4837 scope.go:117] "RemoveContainer" containerID="0ddcb245a681a303f4445dab08f3327e3df698349bd5573d79829fcc09b9c9ef"
Mar 13 12:04:39 crc kubenswrapper[4837]: I0313 12:04:39.394596 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-84c45"
Mar 13 12:04:41 crc kubenswrapper[4837]: I0313 12:04:41.708941 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-84c45"]
Mar 13 12:04:41 crc kubenswrapper[4837]: I0313 12:04:41.709606 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-84c45" podUID="5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" containerName="registry-server" containerID="cri-o://e4431103593ed4766f2c10d7f2c36127f9debf5d279d8749159e768de7bb7cc2" gracePeriod=2
Mar 13 12:04:42 crc kubenswrapper[4837]: I0313 12:04:42.378323 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9"
Mar 13 12:04:42 crc kubenswrapper[4837]: I0313 12:04:42.658633 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc"
Mar 13 12:04:42 crc kubenswrapper[4837]: I0313 12:04:42.867717 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-84c45"
Mar 13 12:04:42 crc kubenswrapper[4837]: I0313 12:04:42.963925 4837 generic.go:334] "Generic (PLEG): container finished" podID="5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" containerID="e4431103593ed4766f2c10d7f2c36127f9debf5d279d8749159e768de7bb7cc2" exitCode=0
Mar 13 12:04:42 crc kubenswrapper[4837]: I0313 12:04:42.963997 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-84c45"
Mar 13 12:04:42 crc kubenswrapper[4837]: I0313 12:04:42.964085 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84c45" event={"ID":"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47","Type":"ContainerDied","Data":"e4431103593ed4766f2c10d7f2c36127f9debf5d279d8749159e768de7bb7cc2"}
Mar 13 12:04:42 crc kubenswrapper[4837]: I0313 12:04:42.964133 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84c45" event={"ID":"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47","Type":"ContainerDied","Data":"975dad2eacabc4e891be9a4559895099caf82984d20099b8ca5021f0401addb6"}
Mar 13 12:04:42 crc kubenswrapper[4837]: I0313 12:04:42.964156 4837 scope.go:117] "RemoveContainer" containerID="e4431103593ed4766f2c10d7f2c36127f9debf5d279d8749159e768de7bb7cc2"
Mar 13 12:04:42 crc kubenswrapper[4837]: I0313 12:04:42.968756 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95c7q" event={"ID":"ecbc61c9-ab9c-485e-9112-bb8704851de8","Type":"ContainerStarted","Data":"15effd1f70e513b4a4e478354af67a4a98140eaf51a9874afc59ee39e91f07c2"}
Mar 13 12:04:42 crc kubenswrapper[4837]: I0313 12:04:42.991146 4837 scope.go:117] "RemoveContainer" containerID="400082a25f5ff89be657587e2ae6b6c9469f186282f23d0b831a5031bd2d1dfb"
Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.025358 4837 scope.go:117] "RemoveContainer" containerID="d879a84e3b9f99d7a40acbf499a69feddf5c1973d117fb58100dc525ee864d7d"
Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.038538 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kfl9\" (UniqueName: \"kubernetes.io/projected/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-kube-api-access-8kfl9\") pod \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\" (UID: \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\") "
Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.038598 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-utilities\") pod \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\" (UID: \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\") "
Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.038699 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-catalog-content\") pod \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\" (UID: \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\") "
Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.039970 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-utilities" (OuterVolumeSpecName: "utilities") pod "5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" (UID: "5b5b628a-9d8f-4ce7-b023-adbb2b00ff47"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.046302 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-kube-api-access-8kfl9" (OuterVolumeSpecName: "kube-api-access-8kfl9") pod "5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" (UID: "5b5b628a-9d8f-4ce7-b023-adbb2b00ff47"). InnerVolumeSpecName "kube-api-access-8kfl9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.096607 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" (UID: "5b5b628a-9d8f-4ce7-b023-adbb2b00ff47"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.117157 4837 scope.go:117] "RemoveContainer" containerID="e4431103593ed4766f2c10d7f2c36127f9debf5d279d8749159e768de7bb7cc2"
Mar 13 12:04:43 crc kubenswrapper[4837]: E0313 12:04:43.117614 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4431103593ed4766f2c10d7f2c36127f9debf5d279d8749159e768de7bb7cc2\": container with ID starting with e4431103593ed4766f2c10d7f2c36127f9debf5d279d8749159e768de7bb7cc2 not found: ID does not exist" containerID="e4431103593ed4766f2c10d7f2c36127f9debf5d279d8749159e768de7bb7cc2"
Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.117714 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4431103593ed4766f2c10d7f2c36127f9debf5d279d8749159e768de7bb7cc2"} err="failed to get container status \"e4431103593ed4766f2c10d7f2c36127f9debf5d279d8749159e768de7bb7cc2\": rpc error: code = NotFound desc = could not find container \"e4431103593ed4766f2c10d7f2c36127f9debf5d279d8749159e768de7bb7cc2\": container with ID starting with e4431103593ed4766f2c10d7f2c36127f9debf5d279d8749159e768de7bb7cc2 not found: ID does not exist"
Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.117739 4837 scope.go:117] "RemoveContainer" containerID="400082a25f5ff89be657587e2ae6b6c9469f186282f23d0b831a5031bd2d1dfb"
Mar 13 12:04:43 crc kubenswrapper[4837]: E0313 12:04:43.118092 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"400082a25f5ff89be657587e2ae6b6c9469f186282f23d0b831a5031bd2d1dfb\": container with ID starting with 400082a25f5ff89be657587e2ae6b6c9469f186282f23d0b831a5031bd2d1dfb not found: ID does not exist" containerID="400082a25f5ff89be657587e2ae6b6c9469f186282f23d0b831a5031bd2d1dfb"
Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.118143 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"400082a25f5ff89be657587e2ae6b6c9469f186282f23d0b831a5031bd2d1dfb"} err="failed to get container status \"400082a25f5ff89be657587e2ae6b6c9469f186282f23d0b831a5031bd2d1dfb\": rpc error: code = NotFound desc = could not find container \"400082a25f5ff89be657587e2ae6b6c9469f186282f23d0b831a5031bd2d1dfb\": container with ID starting with 400082a25f5ff89be657587e2ae6b6c9469f186282f23d0b831a5031bd2d1dfb not found: ID does not exist"
Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.118160 4837 scope.go:117] "RemoveContainer" containerID="d879a84e3b9f99d7a40acbf499a69feddf5c1973d117fb58100dc525ee864d7d"
Mar 13 12:04:43 crc kubenswrapper[4837]: E0313 12:04:43.119669 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d879a84e3b9f99d7a40acbf499a69feddf5c1973d117fb58100dc525ee864d7d\": container with ID starting with d879a84e3b9f99d7a40acbf499a69feddf5c1973d117fb58100dc525ee864d7d not found: ID does
not exist" containerID="d879a84e3b9f99d7a40acbf499a69feddf5c1973d117fb58100dc525ee864d7d" Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.119699 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d879a84e3b9f99d7a40acbf499a69feddf5c1973d117fb58100dc525ee864d7d"} err="failed to get container status \"d879a84e3b9f99d7a40acbf499a69feddf5c1973d117fb58100dc525ee864d7d\": rpc error: code = NotFound desc = could not find container \"d879a84e3b9f99d7a40acbf499a69feddf5c1973d117fb58100dc525ee864d7d\": container with ID starting with d879a84e3b9f99d7a40acbf499a69feddf5c1973d117fb58100dc525ee864d7d not found: ID does not exist" Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.141049 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.141121 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kfl9\" (UniqueName: \"kubernetes.io/projected/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-kube-api-access-8kfl9\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.141139 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.298254 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-84c45"] Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.306859 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-84c45"] Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.983914 4837 generic.go:334] "Generic (PLEG): container finished" 
podID="ecbc61c9-ab9c-485e-9112-bb8704851de8" containerID="15effd1f70e513b4a4e478354af67a4a98140eaf51a9874afc59ee39e91f07c2" exitCode=0 Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.984018 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95c7q" event={"ID":"ecbc61c9-ab9c-485e-9112-bb8704851de8","Type":"ContainerDied","Data":"15effd1f70e513b4a4e478354af67a4a98140eaf51a9874afc59ee39e91f07c2"} Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.116067 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2crl9"] Mar 13 12:04:44 crc kubenswrapper[4837]: E0313 12:04:44.116777 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebbc8197-2f60-4876-8bab-ae450e22db4d" containerName="registry-server" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.116796 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebbc8197-2f60-4876-8bab-ae450e22db4d" containerName="registry-server" Mar 13 12:04:44 crc kubenswrapper[4837]: E0313 12:04:44.116808 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebbc8197-2f60-4876-8bab-ae450e22db4d" containerName="extract-content" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.116814 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebbc8197-2f60-4876-8bab-ae450e22db4d" containerName="extract-content" Mar 13 12:04:44 crc kubenswrapper[4837]: E0313 12:04:44.116834 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebbc8197-2f60-4876-8bab-ae450e22db4d" containerName="extract-utilities" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.116843 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebbc8197-2f60-4876-8bab-ae450e22db4d" containerName="extract-utilities" Mar 13 12:04:44 crc kubenswrapper[4837]: E0313 12:04:44.116854 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" 
containerName="extract-utilities" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.116862 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" containerName="extract-utilities" Mar 13 12:04:44 crc kubenswrapper[4837]: E0313 12:04:44.116877 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" containerName="registry-server" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.116884 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" containerName="registry-server" Mar 13 12:04:44 crc kubenswrapper[4837]: E0313 12:04:44.116899 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" containerName="extract-content" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.116906 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" containerName="extract-content" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.117093 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebbc8197-2f60-4876-8bab-ae450e22db4d" containerName="registry-server" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.117109 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" containerName="registry-server" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.118236 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.127970 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2crl9"] Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.256356 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-utilities\") pod \"redhat-marketplace-2crl9\" (UID: \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\") " pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.256469 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7hbk\" (UniqueName: \"kubernetes.io/projected/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-kube-api-access-l7hbk\") pod \"redhat-marketplace-2crl9\" (UID: \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\") " pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.256522 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-catalog-content\") pod \"redhat-marketplace-2crl9\" (UID: \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\") " pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.357680 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7hbk\" (UniqueName: \"kubernetes.io/projected/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-kube-api-access-l7hbk\") pod \"redhat-marketplace-2crl9\" (UID: \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\") " pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.357740 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-catalog-content\") pod \"redhat-marketplace-2crl9\" (UID: \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\") " pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.357786 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-utilities\") pod \"redhat-marketplace-2crl9\" (UID: \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\") " pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.358220 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-utilities\") pod \"redhat-marketplace-2crl9\" (UID: \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\") " pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.358466 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-catalog-content\") pod \"redhat-marketplace-2crl9\" (UID: \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\") " pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.378902 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7hbk\" (UniqueName: \"kubernetes.io/projected/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-kube-api-access-l7hbk\") pod \"redhat-marketplace-2crl9\" (UID: \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\") " pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.441172 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.668205 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2crl9"] Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.993432 4837 generic.go:334] "Generic (PLEG): container finished" podID="7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" containerID="de4836aad6222fc6702dec9e38648e5ea3face7c15046e0ff57102279a2cabc9" exitCode=0 Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.993550 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2crl9" event={"ID":"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47","Type":"ContainerDied","Data":"de4836aad6222fc6702dec9e38648e5ea3face7c15046e0ff57102279a2cabc9"} Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.994115 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2crl9" event={"ID":"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47","Type":"ContainerStarted","Data":"494be8647d8bf49d20735dd5e68f3201a4a95fac989aa78a899f4c7f78bb7851"} Mar 13 12:04:45 crc kubenswrapper[4837]: I0313 12:04:45.000695 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95c7q" event={"ID":"ecbc61c9-ab9c-485e-9112-bb8704851de8","Type":"ContainerStarted","Data":"c21f95b043609dd72a88f8756cc2bc37d9f2feadcbd2719605057d08fa6289bd"} Mar 13 12:04:45 crc kubenswrapper[4837]: I0313 12:04:45.034790 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-95c7q" podStartSLOduration=2.421715574 podStartE2EDuration="10.034762222s" podCreationTimestamp="2026-03-13 12:04:35 +0000 UTC" firstStartedPulling="2026-03-13 12:04:36.918736101 +0000 UTC m=+992.557002864" lastFinishedPulling="2026-03-13 12:04:44.531782749 +0000 UTC m=+1000.170049512" observedRunningTime="2026-03-13 12:04:45.03155 +0000 UTC 
m=+1000.669816773" watchObservedRunningTime="2026-03-13 12:04:45.034762222 +0000 UTC m=+1000.673028985" Mar 13 12:04:45 crc kubenswrapper[4837]: I0313 12:04:45.056979 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" path="/var/lib/kubelet/pods/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47/volumes" Mar 13 12:04:46 crc kubenswrapper[4837]: I0313 12:04:46.039138 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-95c7q" Mar 13 12:04:46 crc kubenswrapper[4837]: I0313 12:04:46.039337 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-95c7q" Mar 13 12:04:47 crc kubenswrapper[4837]: I0313 12:04:47.014627 4837 generic.go:334] "Generic (PLEG): container finished" podID="7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" containerID="0538e711d2d8831b415f66821458a301532affb1497219de99e29dbcb0da491b" exitCode=0 Mar 13 12:04:47 crc kubenswrapper[4837]: I0313 12:04:47.014735 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2crl9" event={"ID":"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47","Type":"ContainerDied","Data":"0538e711d2d8831b415f66821458a301532affb1497219de99e29dbcb0da491b"} Mar 13 12:04:47 crc kubenswrapper[4837]: I0313 12:04:47.081448 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-95c7q" podUID="ecbc61c9-ab9c-485e-9112-bb8704851de8" containerName="registry-server" probeResult="failure" output=< Mar 13 12:04:47 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Mar 13 12:04:47 crc kubenswrapper[4837]: > Mar 13 12:04:48 crc kubenswrapper[4837]: I0313 12:04:48.037116 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2crl9" 
event={"ID":"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47","Type":"ContainerStarted","Data":"f6edba34bfa2024c1573d537437cd2155fc803ec2e5f085333dfcafbabdc46f8"} Mar 13 12:04:48 crc kubenswrapper[4837]: I0313 12:04:48.062167 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2crl9" podStartSLOduration=1.488537196 podStartE2EDuration="4.062133816s" podCreationTimestamp="2026-03-13 12:04:44 +0000 UTC" firstStartedPulling="2026-03-13 12:04:44.995806278 +0000 UTC m=+1000.634073041" lastFinishedPulling="2026-03-13 12:04:47.569402888 +0000 UTC m=+1003.207669661" observedRunningTime="2026-03-13 12:04:48.056106686 +0000 UTC m=+1003.694373449" watchObservedRunningTime="2026-03-13 12:04:48.062133816 +0000 UTC m=+1003.700400579" Mar 13 12:04:54 crc kubenswrapper[4837]: I0313 12:04:54.443165 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:54 crc kubenswrapper[4837]: I0313 12:04:54.444964 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:54 crc kubenswrapper[4837]: I0313 12:04:54.486574 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:55 crc kubenswrapper[4837]: I0313 12:04:55.121541 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:55 crc kubenswrapper[4837]: I0313 12:04:55.163370 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2crl9"] Mar 13 12:04:56 crc kubenswrapper[4837]: I0313 12:04:56.076381 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-95c7q" Mar 13 12:04:56 crc kubenswrapper[4837]: I0313 12:04:56.117296 4837 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-95c7q" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.096065 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2crl9" podUID="7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" containerName="registry-server" containerID="cri-o://f6edba34bfa2024c1573d537437cd2155fc803ec2e5f085333dfcafbabdc46f8" gracePeriod=2 Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.113744 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-95c7q"] Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.113977 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-95c7q" podUID="ecbc61c9-ab9c-485e-9112-bb8704851de8" containerName="registry-server" containerID="cri-o://c21f95b043609dd72a88f8756cc2bc37d9f2feadcbd2719605057d08fa6289bd" gracePeriod=2 Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.589050 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.608929 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-95c7q" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.755505 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7hbk\" (UniqueName: \"kubernetes.io/projected/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-kube-api-access-l7hbk\") pod \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\" (UID: \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\") " Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.755862 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecbc61c9-ab9c-485e-9112-bb8704851de8-catalog-content\") pod \"ecbc61c9-ab9c-485e-9112-bb8704851de8\" (UID: \"ecbc61c9-ab9c-485e-9112-bb8704851de8\") " Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.755909 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-catalog-content\") pod \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\" (UID: \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\") " Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.755929 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecbc61c9-ab9c-485e-9112-bb8704851de8-utilities\") pod \"ecbc61c9-ab9c-485e-9112-bb8704851de8\" (UID: \"ecbc61c9-ab9c-485e-9112-bb8704851de8\") " Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.755949 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvb2c\" (UniqueName: \"kubernetes.io/projected/ecbc61c9-ab9c-485e-9112-bb8704851de8-kube-api-access-vvb2c\") pod \"ecbc61c9-ab9c-485e-9112-bb8704851de8\" (UID: \"ecbc61c9-ab9c-485e-9112-bb8704851de8\") " Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.755970 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-utilities\") pod \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\" (UID: \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\") " Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.756894 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-utilities" (OuterVolumeSpecName: "utilities") pod "7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" (UID: "7ebf68eb-aa74-4cb1-8c39-11b467b1cf47"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.757080 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecbc61c9-ab9c-485e-9112-bb8704851de8-utilities" (OuterVolumeSpecName: "utilities") pod "ecbc61c9-ab9c-485e-9112-bb8704851de8" (UID: "ecbc61c9-ab9c-485e-9112-bb8704851de8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.761922 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecbc61c9-ab9c-485e-9112-bb8704851de8-kube-api-access-vvb2c" (OuterVolumeSpecName: "kube-api-access-vvb2c") pod "ecbc61c9-ab9c-485e-9112-bb8704851de8" (UID: "ecbc61c9-ab9c-485e-9112-bb8704851de8"). InnerVolumeSpecName "kube-api-access-vvb2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.774944 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-kube-api-access-l7hbk" (OuterVolumeSpecName: "kube-api-access-l7hbk") pod "7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" (UID: "7ebf68eb-aa74-4cb1-8c39-11b467b1cf47"). InnerVolumeSpecName "kube-api-access-l7hbk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.789258 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" (UID: "7ebf68eb-aa74-4cb1-8c39-11b467b1cf47"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.857463 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecbc61c9-ab9c-485e-9112-bb8704851de8-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.857502 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvb2c\" (UniqueName: \"kubernetes.io/projected/ecbc61c9-ab9c-485e-9112-bb8704851de8-kube-api-access-vvb2c\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.857512 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.857521 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7hbk\" (UniqueName: \"kubernetes.io/projected/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-kube-api-access-l7hbk\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.857530 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.896945 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ecbc61c9-ab9c-485e-9112-bb8704851de8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ecbc61c9-ab9c-485e-9112-bb8704851de8" (UID: "ecbc61c9-ab9c-485e-9112-bb8704851de8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.959004 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecbc61c9-ab9c-485e-9112-bb8704851de8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.104201 4837 generic.go:334] "Generic (PLEG): container finished" podID="ecbc61c9-ab9c-485e-9112-bb8704851de8" containerID="c21f95b043609dd72a88f8756cc2bc37d9f2feadcbd2719605057d08fa6289bd" exitCode=0 Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.104277 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95c7q" event={"ID":"ecbc61c9-ab9c-485e-9112-bb8704851de8","Type":"ContainerDied","Data":"c21f95b043609dd72a88f8756cc2bc37d9f2feadcbd2719605057d08fa6289bd"} Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.104321 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95c7q" event={"ID":"ecbc61c9-ab9c-485e-9112-bb8704851de8","Type":"ContainerDied","Data":"d87a304801c0e9f7051845724164f67c6d9d6a0ecd1e18c769cc77332da22942"} Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.104343 4837 scope.go:117] "RemoveContainer" containerID="c21f95b043609dd72a88f8756cc2bc37d9f2feadcbd2719605057d08fa6289bd" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.105354 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-95c7q" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.106817 4837 generic.go:334] "Generic (PLEG): container finished" podID="7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" containerID="f6edba34bfa2024c1573d537437cd2155fc803ec2e5f085333dfcafbabdc46f8" exitCode=0 Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.106968 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.107734 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2crl9" event={"ID":"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47","Type":"ContainerDied","Data":"f6edba34bfa2024c1573d537437cd2155fc803ec2e5f085333dfcafbabdc46f8"} Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.107784 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2crl9" event={"ID":"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47","Type":"ContainerDied","Data":"494be8647d8bf49d20735dd5e68f3201a4a95fac989aa78a899f4c7f78bb7851"} Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.149255 4837 scope.go:117] "RemoveContainer" containerID="15effd1f70e513b4a4e478354af67a4a98140eaf51a9874afc59ee39e91f07c2" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.179359 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-95c7q"] Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.179554 4837 scope.go:117] "RemoveContainer" containerID="80e7a9f84e3e53924280e958be2d8ebd2909130bea780f65b73b56b9eeb2548c" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.187313 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-95c7q"] Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.199166 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-2crl9"]
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.209392 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2crl9"]
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.211871 4837 scope.go:117] "RemoveContainer" containerID="c21f95b043609dd72a88f8756cc2bc37d9f2feadcbd2719605057d08fa6289bd"
Mar 13 12:04:58 crc kubenswrapper[4837]: E0313 12:04:58.212410 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c21f95b043609dd72a88f8756cc2bc37d9f2feadcbd2719605057d08fa6289bd\": container with ID starting with c21f95b043609dd72a88f8756cc2bc37d9f2feadcbd2719605057d08fa6289bd not found: ID does not exist" containerID="c21f95b043609dd72a88f8756cc2bc37d9f2feadcbd2719605057d08fa6289bd"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.212442 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c21f95b043609dd72a88f8756cc2bc37d9f2feadcbd2719605057d08fa6289bd"} err="failed to get container status \"c21f95b043609dd72a88f8756cc2bc37d9f2feadcbd2719605057d08fa6289bd\": rpc error: code = NotFound desc = could not find container \"c21f95b043609dd72a88f8756cc2bc37d9f2feadcbd2719605057d08fa6289bd\": container with ID starting with c21f95b043609dd72a88f8756cc2bc37d9f2feadcbd2719605057d08fa6289bd not found: ID does not exist"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.212464 4837 scope.go:117] "RemoveContainer" containerID="15effd1f70e513b4a4e478354af67a4a98140eaf51a9874afc59ee39e91f07c2"
Mar 13 12:04:58 crc kubenswrapper[4837]: E0313 12:04:58.214048 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15effd1f70e513b4a4e478354af67a4a98140eaf51a9874afc59ee39e91f07c2\": container with ID starting with 15effd1f70e513b4a4e478354af67a4a98140eaf51a9874afc59ee39e91f07c2 not found: ID does not exist" containerID="15effd1f70e513b4a4e478354af67a4a98140eaf51a9874afc59ee39e91f07c2"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.214082 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15effd1f70e513b4a4e478354af67a4a98140eaf51a9874afc59ee39e91f07c2"} err="failed to get container status \"15effd1f70e513b4a4e478354af67a4a98140eaf51a9874afc59ee39e91f07c2\": rpc error: code = NotFound desc = could not find container \"15effd1f70e513b4a4e478354af67a4a98140eaf51a9874afc59ee39e91f07c2\": container with ID starting with 15effd1f70e513b4a4e478354af67a4a98140eaf51a9874afc59ee39e91f07c2 not found: ID does not exist"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.214105 4837 scope.go:117] "RemoveContainer" containerID="80e7a9f84e3e53924280e958be2d8ebd2909130bea780f65b73b56b9eeb2548c"
Mar 13 12:04:58 crc kubenswrapper[4837]: E0313 12:04:58.214506 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80e7a9f84e3e53924280e958be2d8ebd2909130bea780f65b73b56b9eeb2548c\": container with ID starting with 80e7a9f84e3e53924280e958be2d8ebd2909130bea780f65b73b56b9eeb2548c not found: ID does not exist" containerID="80e7a9f84e3e53924280e958be2d8ebd2909130bea780f65b73b56b9eeb2548c"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.214529 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80e7a9f84e3e53924280e958be2d8ebd2909130bea780f65b73b56b9eeb2548c"} err="failed to get container status \"80e7a9f84e3e53924280e958be2d8ebd2909130bea780f65b73b56b9eeb2548c\": rpc error: code = NotFound desc = could not find container \"80e7a9f84e3e53924280e958be2d8ebd2909130bea780f65b73b56b9eeb2548c\": container with ID starting with 80e7a9f84e3e53924280e958be2d8ebd2909130bea780f65b73b56b9eeb2548c not found: ID does not exist"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.214542 4837 scope.go:117] "RemoveContainer" containerID="f6edba34bfa2024c1573d537437cd2155fc803ec2e5f085333dfcafbabdc46f8"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.239037 4837 scope.go:117] "RemoveContainer" containerID="0538e711d2d8831b415f66821458a301532affb1497219de99e29dbcb0da491b"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.256079 4837 scope.go:117] "RemoveContainer" containerID="de4836aad6222fc6702dec9e38648e5ea3face7c15046e0ff57102279a2cabc9"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.294808 4837 scope.go:117] "RemoveContainer" containerID="f6edba34bfa2024c1573d537437cd2155fc803ec2e5f085333dfcafbabdc46f8"
Mar 13 12:04:58 crc kubenswrapper[4837]: E0313 12:04:58.295389 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6edba34bfa2024c1573d537437cd2155fc803ec2e5f085333dfcafbabdc46f8\": container with ID starting with f6edba34bfa2024c1573d537437cd2155fc803ec2e5f085333dfcafbabdc46f8 not found: ID does not exist" containerID="f6edba34bfa2024c1573d537437cd2155fc803ec2e5f085333dfcafbabdc46f8"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.295497 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6edba34bfa2024c1573d537437cd2155fc803ec2e5f085333dfcafbabdc46f8"} err="failed to get container status \"f6edba34bfa2024c1573d537437cd2155fc803ec2e5f085333dfcafbabdc46f8\": rpc error: code = NotFound desc = could not find container \"f6edba34bfa2024c1573d537437cd2155fc803ec2e5f085333dfcafbabdc46f8\": container with ID starting with f6edba34bfa2024c1573d537437cd2155fc803ec2e5f085333dfcafbabdc46f8 not found: ID does not exist"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.295591 4837 scope.go:117] "RemoveContainer" containerID="0538e711d2d8831b415f66821458a301532affb1497219de99e29dbcb0da491b"
Mar 13 12:04:58 crc kubenswrapper[4837]: E0313 12:04:58.296010 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0538e711d2d8831b415f66821458a301532affb1497219de99e29dbcb0da491b\": container with ID starting with 0538e711d2d8831b415f66821458a301532affb1497219de99e29dbcb0da491b not found: ID does not exist" containerID="0538e711d2d8831b415f66821458a301532affb1497219de99e29dbcb0da491b"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.296124 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0538e711d2d8831b415f66821458a301532affb1497219de99e29dbcb0da491b"} err="failed to get container status \"0538e711d2d8831b415f66821458a301532affb1497219de99e29dbcb0da491b\": rpc error: code = NotFound desc = could not find container \"0538e711d2d8831b415f66821458a301532affb1497219de99e29dbcb0da491b\": container with ID starting with 0538e711d2d8831b415f66821458a301532affb1497219de99e29dbcb0da491b not found: ID does not exist"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.296222 4837 scope.go:117] "RemoveContainer" containerID="de4836aad6222fc6702dec9e38648e5ea3face7c15046e0ff57102279a2cabc9"
Mar 13 12:04:58 crc kubenswrapper[4837]: E0313 12:04:58.296580 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de4836aad6222fc6702dec9e38648e5ea3face7c15046e0ff57102279a2cabc9\": container with ID starting with de4836aad6222fc6702dec9e38648e5ea3face7c15046e0ff57102279a2cabc9 not found: ID does not exist" containerID="de4836aad6222fc6702dec9e38648e5ea3face7c15046e0ff57102279a2cabc9"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.296705 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de4836aad6222fc6702dec9e38648e5ea3face7c15046e0ff57102279a2cabc9"} err="failed to get container status \"de4836aad6222fc6702dec9e38648e5ea3face7c15046e0ff57102279a2cabc9\": rpc error: code = NotFound desc = could not find container \"de4836aad6222fc6702dec9e38648e5ea3face7c15046e0ff57102279a2cabc9\": container with ID starting with de4836aad6222fc6702dec9e38648e5ea3face7c15046e0ff57102279a2cabc9 not found: ID does not exist"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.723627 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-29x4s"]
Mar 13 12:04:58 crc kubenswrapper[4837]: E0313 12:04:58.724292 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecbc61c9-ab9c-485e-9112-bb8704851de8" containerName="registry-server"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.724316 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecbc61c9-ab9c-485e-9112-bb8704851de8" containerName="registry-server"
Mar 13 12:04:58 crc kubenswrapper[4837]: E0313 12:04:58.724330 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecbc61c9-ab9c-485e-9112-bb8704851de8" containerName="extract-content"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.724338 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecbc61c9-ab9c-485e-9112-bb8704851de8" containerName="extract-content"
Mar 13 12:04:58 crc kubenswrapper[4837]: E0313 12:04:58.724356 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecbc61c9-ab9c-485e-9112-bb8704851de8" containerName="extract-utilities"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.724366 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecbc61c9-ab9c-485e-9112-bb8704851de8" containerName="extract-utilities"
Mar 13 12:04:58 crc kubenswrapper[4837]: E0313 12:04:58.724383 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" containerName="registry-server"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.724391 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" containerName="registry-server"
Mar 13 12:04:58 crc kubenswrapper[4837]: E0313 12:04:58.724403 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" containerName="extract-content"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.724411 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" containerName="extract-content"
Mar 13 12:04:58 crc kubenswrapper[4837]: E0313 12:04:58.724426 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" containerName="extract-utilities"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.724433 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" containerName="extract-utilities"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.724609 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecbc61c9-ab9c-485e-9112-bb8704851de8" containerName="registry-server"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.724650 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" containerName="registry-server"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.725548 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-29x4s"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.728159 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.728402 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.729998 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.735994 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-29x4s"]
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.736119 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-ng87d"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.787966 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4pw9n"]
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.789412 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.792397 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.800179 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4pw9n"]
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.871833 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6rdw\" (UniqueName: \"kubernetes.io/projected/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-kube-api-access-d6rdw\") pod \"dnsmasq-dns-78dd6ddcc-4pw9n\" (UID: \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.872107 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-config\") pod \"dnsmasq-dns-78dd6ddcc-4pw9n\" (UID: \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.872248 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63688ba3-e68c-4f88-a6e4-6c373b30f929-config\") pod \"dnsmasq-dns-675f4bcbfc-29x4s\" (UID: \"63688ba3-e68c-4f88-a6e4-6c373b30f929\") " pod="openstack/dnsmasq-dns-675f4bcbfc-29x4s"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.872324 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wlx2\" (UniqueName: \"kubernetes.io/projected/63688ba3-e68c-4f88-a6e4-6c373b30f929-kube-api-access-6wlx2\") pod \"dnsmasq-dns-675f4bcbfc-29x4s\" (UID: \"63688ba3-e68c-4f88-a6e4-6c373b30f929\") " pod="openstack/dnsmasq-dns-675f4bcbfc-29x4s"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.872432 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4pw9n\" (UID: \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.973115 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6rdw\" (UniqueName: \"kubernetes.io/projected/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-kube-api-access-d6rdw\") pod \"dnsmasq-dns-78dd6ddcc-4pw9n\" (UID: \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.973179 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-config\") pod \"dnsmasq-dns-78dd6ddcc-4pw9n\" (UID: \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.973213 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63688ba3-e68c-4f88-a6e4-6c373b30f929-config\") pod \"dnsmasq-dns-675f4bcbfc-29x4s\" (UID: \"63688ba3-e68c-4f88-a6e4-6c373b30f929\") " pod="openstack/dnsmasq-dns-675f4bcbfc-29x4s"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.973235 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wlx2\" (UniqueName: \"kubernetes.io/projected/63688ba3-e68c-4f88-a6e4-6c373b30f929-kube-api-access-6wlx2\") pod \"dnsmasq-dns-675f4bcbfc-29x4s\" (UID: \"63688ba3-e68c-4f88-a6e4-6c373b30f929\") " pod="openstack/dnsmasq-dns-675f4bcbfc-29x4s"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.973265 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4pw9n\" (UID: \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.974108 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4pw9n\" (UID: \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.974510 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-config\") pod \"dnsmasq-dns-78dd6ddcc-4pw9n\" (UID: \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.974509 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63688ba3-e68c-4f88-a6e4-6c373b30f929-config\") pod \"dnsmasq-dns-675f4bcbfc-29x4s\" (UID: \"63688ba3-e68c-4f88-a6e4-6c373b30f929\") " pod="openstack/dnsmasq-dns-675f4bcbfc-29x4s"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.990557 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6rdw\" (UniqueName: \"kubernetes.io/projected/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-kube-api-access-d6rdw\") pod \"dnsmasq-dns-78dd6ddcc-4pw9n\" (UID: \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n"
Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.990577 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wlx2\" (UniqueName: \"kubernetes.io/projected/63688ba3-e68c-4f88-a6e4-6c373b30f929-kube-api-access-6wlx2\") pod \"dnsmasq-dns-675f4bcbfc-29x4s\" (UID: \"63688ba3-e68c-4f88-a6e4-6c373b30f929\") " pod="openstack/dnsmasq-dns-675f4bcbfc-29x4s"
Mar 13 12:04:59 crc kubenswrapper[4837]: I0313 12:04:59.043571 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-29x4s"
Mar 13 12:04:59 crc kubenswrapper[4837]: I0313 12:04:59.056237 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" path="/var/lib/kubelet/pods/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47/volumes"
Mar 13 12:04:59 crc kubenswrapper[4837]: I0313 12:04:59.057149 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecbc61c9-ab9c-485e-9112-bb8704851de8" path="/var/lib/kubelet/pods/ecbc61c9-ab9c-485e-9112-bb8704851de8/volumes"
Mar 13 12:04:59 crc kubenswrapper[4837]: I0313 12:04:59.120599 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n"
Mar 13 12:04:59 crc kubenswrapper[4837]: I0313 12:04:59.445905 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-29x4s"]
Mar 13 12:04:59 crc kubenswrapper[4837]: I0313 12:04:59.582101 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4pw9n"]
Mar 13 12:05:00 crc kubenswrapper[4837]: I0313 12:05:00.137167 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n" event={"ID":"5b9562f6-0527-40b4-9b2e-f5b2f22aa272","Type":"ContainerStarted","Data":"da674bf6aef47158bcb9f95c5eb9d1a420c65f8f1989031a5ce339d17724e353"}
Mar 13 12:05:00 crc kubenswrapper[4837]: I0313 12:05:00.138617 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-29x4s" event={"ID":"63688ba3-e68c-4f88-a6e4-6c373b30f929","Type":"ContainerStarted","Data":"a94e60840947634f6bf1cb267e8fd71afc4b1db7581a3cc51184314c3d6b19e7"}
Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.654070 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-29x4s"]
Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.688941 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-7wbz7"]
Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.690337 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7"
Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.700421 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-7wbz7"]
Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.717164 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b082689f-6a6d-4da0-b2b1-f78343ba1e85-config\") pod \"dnsmasq-dns-5ccc8479f9-7wbz7\" (UID: \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7"
Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.717286 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcjgv\" (UniqueName: \"kubernetes.io/projected/b082689f-6a6d-4da0-b2b1-f78343ba1e85-kube-api-access-bcjgv\") pod \"dnsmasq-dns-5ccc8479f9-7wbz7\" (UID: \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7"
Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.717313 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b082689f-6a6d-4da0-b2b1-f78343ba1e85-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-7wbz7\" (UID: \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7"
Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.819079 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcjgv\" (UniqueName: \"kubernetes.io/projected/b082689f-6a6d-4da0-b2b1-f78343ba1e85-kube-api-access-bcjgv\") pod \"dnsmasq-dns-5ccc8479f9-7wbz7\" (UID: \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7"
Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.819135 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b082689f-6a6d-4da0-b2b1-f78343ba1e85-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-7wbz7\" (UID: \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7"
Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.819161 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b082689f-6a6d-4da0-b2b1-f78343ba1e85-config\") pod \"dnsmasq-dns-5ccc8479f9-7wbz7\" (UID: \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7"
Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.820213 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b082689f-6a6d-4da0-b2b1-f78343ba1e85-config\") pod \"dnsmasq-dns-5ccc8479f9-7wbz7\" (UID: \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7"
Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.822058 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b082689f-6a6d-4da0-b2b1-f78343ba1e85-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-7wbz7\" (UID: \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7"
Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.848161 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcjgv\" (UniqueName: \"kubernetes.io/projected/b082689f-6a6d-4da0-b2b1-f78343ba1e85-kube-api-access-bcjgv\") pod \"dnsmasq-dns-5ccc8479f9-7wbz7\" (UID: \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7"
Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.939189 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4pw9n"]
Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.963137 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7vs6f"]
Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.966806 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f"
Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.982372 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7vs6f"]
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.011982 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.022622 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-config\") pod \"dnsmasq-dns-57d769cc4f-7vs6f\" (UID: \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\") " pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.022730 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhg95\" (UniqueName: \"kubernetes.io/projected/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-kube-api-access-lhg95\") pod \"dnsmasq-dns-57d769cc4f-7vs6f\" (UID: \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\") " pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.022752 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-7vs6f\" (UID: \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\") " pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.124427 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-config\") pod \"dnsmasq-dns-57d769cc4f-7vs6f\" (UID: \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\") " pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.124469 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhg95\" (UniqueName: \"kubernetes.io/projected/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-kube-api-access-lhg95\") pod \"dnsmasq-dns-57d769cc4f-7vs6f\" (UID: \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\") " pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.125842 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-7vs6f\" (UID: \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\") " pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.127659 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-config\") pod \"dnsmasq-dns-57d769cc4f-7vs6f\" (UID: \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\") " pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.128375 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-7vs6f\" (UID: \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\") " pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.149108 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhg95\" (UniqueName: \"kubernetes.io/projected/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-kube-api-access-lhg95\") pod \"dnsmasq-dns-57d769cc4f-7vs6f\" (UID: \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\") " pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.291342 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.822946 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.824358 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.832713 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.832750 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.832920 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mb2tp"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.833025 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.833121 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.833217 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.833307 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.837219 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.939261 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.939653 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfz87\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-kube-api-access-wfz87\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.939686 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.939726 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.939769 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.939810 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.939842 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.939866 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/13254c8b-516c-435e-9db2-a8d518434f29-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.939891 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/13254c8b-516c-435e-9db2-a8d518434f29-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.939974 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.940153 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.041436 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.041506 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.041548 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfz87\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-kube-api-access-wfz87\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.041573 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.041604 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.041627 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.041685 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.041706 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.041730 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/13254c8b-516c-435e-9db2-a8d518434f29-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.041755 4837 reconciler_common.go:218] "operationExecutor.MountVolume started
for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/13254c8b-516c-435e-9db2-a8d518434f29-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.041782 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.042342 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.042371 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.042754 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.043059 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.043278 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.044078 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.046860 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.046941 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/13254c8b-516c-435e-9db2-a8d518434f29-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.053658 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.054439 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/13254c8b-516c-435e-9db2-a8d518434f29-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.058909 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfz87\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-kube-api-access-wfz87\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.070534 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.117476 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.119933 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.122984 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.123323 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.123429 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.123719 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.124203 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-8bxdt" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.124788 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.129470 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.144682 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.158365 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.243742 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e7b01be4-73b6-48eb-a06d-4fb38863d982-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.243795 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.243817 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.243831 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-config-data\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.243860 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc 
kubenswrapper[4837]: I0313 12:05:03.243873 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.243892 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.243948 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.243964 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.243987 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl7pd\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-kube-api-access-pl7pd\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.244005 
4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e7b01be4-73b6-48eb-a06d-4fb38863d982-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.345030 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.345085 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.345111 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.345175 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.345192 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.345215 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl7pd\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-kube-api-access-pl7pd\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.345234 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e7b01be4-73b6-48eb-a06d-4fb38863d982-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.345258 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e7b01be4-73b6-48eb-a06d-4fb38863d982-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.345278 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.345301 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " 
pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.345317 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-config-data\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.346191 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-config-data\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.347251 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.348497 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.349348 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.349455 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.349491 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.349659 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.351826 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e7b01be4-73b6-48eb-a06d-4fb38863d982-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.354025 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.360598 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e7b01be4-73b6-48eb-a06d-4fb38863d982-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " 
pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.371449 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.379340 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl7pd\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-kube-api-access-pl7pd\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.453791 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.417437 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.420150 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.422576 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.422805 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.422947 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-nz226" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.426529 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.430170 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.433051 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.563775 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.563897 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-operator-scripts\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.563939 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ncdgz\" (UniqueName: \"kubernetes.io/projected/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-kube-api-access-ncdgz\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.563966 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-config-data-default\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.563997 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.564203 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-kolla-config\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.564341 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.564393 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.666209 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.666305 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-operator-scripts\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.666339 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncdgz\" (UniqueName: \"kubernetes.io/projected/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-kube-api-access-ncdgz\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.666366 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-config-data-default\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.666399 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.666480 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-kolla-config\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.666525 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.666554 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.666689 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.667465 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-kolla-config\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 
12:05:04.667620 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-operator-scripts\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.667871 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-config-data-default\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.667891 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.671591 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.677317 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.694768 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.702248 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncdgz\" (UniqueName: \"kubernetes.io/projected/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-kube-api-access-ncdgz\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.745310 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.484300 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.484618 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.484736 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.485771 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"62df99fa64e257c350cea1390039e0bd2f2c672bf6d80836ec3df94beec3d8d1"} 
pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.485835 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://62df99fa64e257c350cea1390039e0bd2f2c672bf6d80836ec3df94beec3d8d1" gracePeriod=600 Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.792323 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.802864 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.805340 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.813696 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.813915 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-5bq6m" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.814100 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.814877 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.886837 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/362e31d4-ea62-40ed-8426-982d47559472-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.886930 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/362e31d4-ea62-40ed-8426-982d47559472-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.886980 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/362e31d4-ea62-40ed-8426-982d47559472-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.887012 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/362e31d4-ea62-40ed-8426-982d47559472-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.887039 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.887066 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq4vp\" 
(UniqueName: \"kubernetes.io/projected/362e31d4-ea62-40ed-8426-982d47559472-kube-api-access-dq4vp\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.887232 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/362e31d4-ea62-40ed-8426-982d47559472-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.887285 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/362e31d4-ea62-40ed-8426-982d47559472-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.913483 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.914612 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.918077 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.918890 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.919064 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-8shmp" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.952430 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.988992 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq4vp\" (UniqueName: \"kubernetes.io/projected/362e31d4-ea62-40ed-8426-982d47559472-kube-api-access-dq4vp\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.989046 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae39431b-5fa4-4a09-b76f-44b4d256c129-config-data\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.989065 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae39431b-5fa4-4a09-b76f-44b4d256c129-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.989084 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/362e31d4-ea62-40ed-8426-982d47559472-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.989101 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/362e31d4-ea62-40ed-8426-982d47559472-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.989141 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/362e31d4-ea62-40ed-8426-982d47559472-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.989478 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/362e31d4-ea62-40ed-8426-982d47559472-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.990019 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/362e31d4-ea62-40ed-8426-982d47559472-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.990659 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/362e31d4-ea62-40ed-8426-982d47559472-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.994092 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/362e31d4-ea62-40ed-8426-982d47559472-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.994231 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/362e31d4-ea62-40ed-8426-982d47559472-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.994390 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvsbl\" (UniqueName: \"kubernetes.io/projected/ae39431b-5fa4-4a09-b76f-44b4d256c129-kube-api-access-lvsbl\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.994441 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/362e31d4-ea62-40ed-8426-982d47559472-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.994562 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ae39431b-5fa4-4a09-b76f-44b4d256c129-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.994705 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.994758 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ae39431b-5fa4-4a09-b76f-44b4d256c129-kolla-config\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.994899 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/362e31d4-ea62-40ed-8426-982d47559472-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.994905 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/362e31d4-ea62-40ed-8426-982d47559472-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.995436 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") device mount path 
\"/mnt/openstack/pv10\"" pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.995561 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/362e31d4-ea62-40ed-8426-982d47559472-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.009966 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq4vp\" (UniqueName: \"kubernetes.io/projected/362e31d4-ea62-40ed-8426-982d47559472-kube-api-access-dq4vp\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.022623 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.096965 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvsbl\" (UniqueName: \"kubernetes.io/projected/ae39431b-5fa4-4a09-b76f-44b4d256c129-kube-api-access-lvsbl\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.097037 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae39431b-5fa4-4a09-b76f-44b4d256c129-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.097067 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ae39431b-5fa4-4a09-b76f-44b4d256c129-kolla-config\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.097099 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae39431b-5fa4-4a09-b76f-44b4d256c129-config-data\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.097140 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae39431b-5fa4-4a09-b76f-44b4d256c129-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.098952 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae39431b-5fa4-4a09-b76f-44b4d256c129-config-data\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.099119 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ae39431b-5fa4-4a09-b76f-44b4d256c129-kolla-config\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.102404 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae39431b-5fa4-4a09-b76f-44b4d256c129-memcached-tls-certs\") pod \"memcached-0\" (UID: 
\"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.109934 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae39431b-5fa4-4a09-b76f-44b4d256c129-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.120827 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvsbl\" (UniqueName: \"kubernetes.io/projected/ae39431b-5fa4-4a09-b76f-44b4d256c129-kube-api-access-lvsbl\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.184333 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="62df99fa64e257c350cea1390039e0bd2f2c672bf6d80836ec3df94beec3d8d1" exitCode=0 Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.184387 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"62df99fa64e257c350cea1390039e0bd2f2c672bf6d80836ec3df94beec3d8d1"} Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.184427 4837 scope.go:117] "RemoveContainer" containerID="86010f8ae6e03e22840b0db405e4816a52e1a80af0eff6188dd5d3d81e63937a" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.186663 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.238040 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 13 12:05:08 crc kubenswrapper[4837]: I0313 12:05:08.093229 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 12:05:08 crc kubenswrapper[4837]: I0313 12:05:08.094787 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 12:05:08 crc kubenswrapper[4837]: I0313 12:05:08.096700 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-ct7wn" Mar 13 12:05:08 crc kubenswrapper[4837]: I0313 12:05:08.106254 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 12:05:08 crc kubenswrapper[4837]: I0313 12:05:08.138589 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vwh6\" (UniqueName: \"kubernetes.io/projected/a250849d-ca15-40fa-8b1d-a32b5abc6861-kube-api-access-9vwh6\") pod \"kube-state-metrics-0\" (UID: \"a250849d-ca15-40fa-8b1d-a32b5abc6861\") " pod="openstack/kube-state-metrics-0" Mar 13 12:05:08 crc kubenswrapper[4837]: I0313 12:05:08.239733 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vwh6\" (UniqueName: \"kubernetes.io/projected/a250849d-ca15-40fa-8b1d-a32b5abc6861-kube-api-access-9vwh6\") pod \"kube-state-metrics-0\" (UID: \"a250849d-ca15-40fa-8b1d-a32b5abc6861\") " pod="openstack/kube-state-metrics-0" Mar 13 12:05:08 crc kubenswrapper[4837]: I0313 12:05:08.268941 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vwh6\" (UniqueName: \"kubernetes.io/projected/a250849d-ca15-40fa-8b1d-a32b5abc6861-kube-api-access-9vwh6\") pod \"kube-state-metrics-0\" (UID: \"a250849d-ca15-40fa-8b1d-a32b5abc6861\") " pod="openstack/kube-state-metrics-0" Mar 13 12:05:08 crc kubenswrapper[4837]: I0313 12:05:08.422275 4837 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 12:05:10 crc kubenswrapper[4837]: I0313 12:05:10.224961 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-7wbz7"] Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.607902 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nbhpw"] Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.614623 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.618439 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nbhpw"] Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.620318 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-hlqhf" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.620600 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.620807 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.671466 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-ls998"] Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.673361 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.684618 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-ls998"] Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.704049 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw65n\" (UniqueName: \"kubernetes.io/projected/32dc51d9-5638-4530-91c8-5be8c13e60f3-kube-api-access-dw65n\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.704142 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/32dc51d9-5638-4530-91c8-5be8c13e60f3-var-run-ovn\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.704174 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dc51d9-5638-4530-91c8-5be8c13e60f3-ovn-controller-tls-certs\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.704223 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dc51d9-5638-4530-91c8-5be8c13e60f3-combined-ca-bundle\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.704250 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/32dc51d9-5638-4530-91c8-5be8c13e60f3-var-log-ovn\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.704282 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/32dc51d9-5638-4530-91c8-5be8c13e60f3-var-run\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.704305 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32dc51d9-5638-4530-91c8-5be8c13e60f3-scripts\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.805955 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dc51d9-5638-4530-91c8-5be8c13e60f3-ovn-controller-tls-certs\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.805999 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/71e00962-6b2f-495c-8f34-52993f66cef9-var-lib\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.806046 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/32dc51d9-5638-4530-91c8-5be8c13e60f3-combined-ca-bundle\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.806069 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/32dc51d9-5638-4530-91c8-5be8c13e60f3-var-log-ovn\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.806096 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/32dc51d9-5638-4530-91c8-5be8c13e60f3-var-run\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.806114 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32dc51d9-5638-4530-91c8-5be8c13e60f3-scripts\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.806137 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw65n\" (UniqueName: \"kubernetes.io/projected/32dc51d9-5638-4530-91c8-5be8c13e60f3-kube-api-access-dw65n\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.806153 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/71e00962-6b2f-495c-8f34-52993f66cef9-var-run\") pod \"ovn-controller-ovs-ls998\" (UID: 
\"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.806177 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxp7g\" (UniqueName: \"kubernetes.io/projected/71e00962-6b2f-495c-8f34-52993f66cef9-kube-api-access-mxp7g\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.806199 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/71e00962-6b2f-495c-8f34-52993f66cef9-var-log\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.806226 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71e00962-6b2f-495c-8f34-52993f66cef9-scripts\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.806248 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/71e00962-6b2f-495c-8f34-52993f66cef9-etc-ovs\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.806274 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/32dc51d9-5638-4530-91c8-5be8c13e60f3-var-run-ovn\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " 
pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.807311 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/32dc51d9-5638-4530-91c8-5be8c13e60f3-var-run-ovn\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.807377 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/32dc51d9-5638-4530-91c8-5be8c13e60f3-var-log-ovn\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.807406 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/32dc51d9-5638-4530-91c8-5be8c13e60f3-var-run\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.808467 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32dc51d9-5638-4530-91c8-5be8c13e60f3-scripts\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.815395 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dc51d9-5638-4530-91c8-5be8c13e60f3-ovn-controller-tls-certs\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.816342 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/32dc51d9-5638-4530-91c8-5be8c13e60f3-combined-ca-bundle\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.826354 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw65n\" (UniqueName: \"kubernetes.io/projected/32dc51d9-5638-4530-91c8-5be8c13e60f3-kube-api-access-dw65n\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.907682 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/71e00962-6b2f-495c-8f34-52993f66cef9-etc-ovs\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.907768 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/71e00962-6b2f-495c-8f34-52993f66cef9-var-lib\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.907862 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/71e00962-6b2f-495c-8f34-52993f66cef9-var-run\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.907898 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxp7g\" (UniqueName: \"kubernetes.io/projected/71e00962-6b2f-495c-8f34-52993f66cef9-kube-api-access-mxp7g\") pod \"ovn-controller-ovs-ls998\" 
(UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.907926 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/71e00962-6b2f-495c-8f34-52993f66cef9-var-log\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.907957 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71e00962-6b2f-495c-8f34-52993f66cef9-scripts\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.908008 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/71e00962-6b2f-495c-8f34-52993f66cef9-etc-ovs\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.908079 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/71e00962-6b2f-495c-8f34-52993f66cef9-var-run\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.908226 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/71e00962-6b2f-495c-8f34-52993f66cef9-var-lib\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.908619 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/71e00962-6b2f-495c-8f34-52993f66cef9-var-log\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.910277 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71e00962-6b2f-495c-8f34-52993f66cef9-scripts\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.934353 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxp7g\" (UniqueName: \"kubernetes.io/projected/71e00962-6b2f-495c-8f34-52993f66cef9-kube-api-access-mxp7g\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.934825 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:12 crc kubenswrapper[4837]: I0313 12:05:12.004000 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.692207 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.693973 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.696176 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-rlhxr" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.696207 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.696222 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.696514 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.703403 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.707006 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.754369 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/38d61ffe-3c44-4657-bc91-d849f766a3e1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.754423 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.754459 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38d61ffe-3c44-4657-bc91-d849f766a3e1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.754486 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38d61ffe-3c44-4657-bc91-d849f766a3e1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.754503 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbxmh\" (UniqueName: \"kubernetes.io/projected/38d61ffe-3c44-4657-bc91-d849f766a3e1-kube-api-access-fbxmh\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.754531 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/38d61ffe-3c44-4657-bc91-d849f766a3e1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.754564 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38d61ffe-3c44-4657-bc91-d849f766a3e1-config\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.754600 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/38d61ffe-3c44-4657-bc91-d849f766a3e1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.856486 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38d61ffe-3c44-4657-bc91-d849f766a3e1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.856540 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38d61ffe-3c44-4657-bc91-d849f766a3e1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.856574 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbxmh\" (UniqueName: \"kubernetes.io/projected/38d61ffe-3c44-4657-bc91-d849f766a3e1-kube-api-access-fbxmh\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.856612 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/38d61ffe-3c44-4657-bc91-d849f766a3e1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.856673 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38d61ffe-3c44-4657-bc91-d849f766a3e1-config\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " 
pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.856724 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/38d61ffe-3c44-4657-bc91-d849f766a3e1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.856807 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/38d61ffe-3c44-4657-bc91-d849f766a3e1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.856843 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.857300 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.861345 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/38d61ffe-3c44-4657-bc91-d849f766a3e1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.862193 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/38d61ffe-3c44-4657-bc91-d849f766a3e1-config\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.863084 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38d61ffe-3c44-4657-bc91-d849f766a3e1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.867908 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/38d61ffe-3c44-4657-bc91-d849f766a3e1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.869972 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38d61ffe-3c44-4657-bc91-d849f766a3e1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.870736 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/38d61ffe-3c44-4657-bc91-d849f766a3e1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.880397 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbxmh\" (UniqueName: \"kubernetes.io/projected/38d61ffe-3c44-4657-bc91-d849f766a3e1-kube-api-access-fbxmh\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " 
pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.882515 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.884353 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.888243 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.888824 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.889123 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-nmsn4" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.889558 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.892714 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.893036 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.957807 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d10fcb0-4d45-45bf-a663-971b8ce74010-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 
12:05:14.957874 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d10fcb0-4d45-45bf-a663-971b8ce74010-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.957910 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xpnw\" (UniqueName: \"kubernetes.io/projected/3d10fcb0-4d45-45bf-a663-971b8ce74010-kube-api-access-2xpnw\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.958267 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d10fcb0-4d45-45bf-a663-971b8ce74010-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.958305 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.958343 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d10fcb0-4d45-45bf-a663-971b8ce74010-config\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.958365 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3d10fcb0-4d45-45bf-a663-971b8ce74010-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.958388 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d10fcb0-4d45-45bf-a663-971b8ce74010-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.962419 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.020284 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.059312 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d10fcb0-4d45-45bf-a663-971b8ce74010-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.059360 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xpnw\" (UniqueName: \"kubernetes.io/projected/3d10fcb0-4d45-45bf-a663-971b8ce74010-kube-api-access-2xpnw\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.059412 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d10fcb0-4d45-45bf-a663-971b8ce74010-scripts\") pod \"ovsdbserver-sb-0\" (UID: 
\"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.059436 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.059468 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d10fcb0-4d45-45bf-a663-971b8ce74010-config\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.059491 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3d10fcb0-4d45-45bf-a663-971b8ce74010-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.059511 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d10fcb0-4d45-45bf-a663-971b8ce74010-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.059546 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d10fcb0-4d45-45bf-a663-971b8ce74010-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.060218 4837 operation_generator.go:580] "MountVolume.MountDevice 
succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.060514 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3d10fcb0-4d45-45bf-a663-971b8ce74010-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.069200 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d10fcb0-4d45-45bf-a663-971b8ce74010-config\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.073412 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d10fcb0-4d45-45bf-a663-971b8ce74010-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.077855 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d10fcb0-4d45-45bf-a663-971b8ce74010-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.079548 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d10fcb0-4d45-45bf-a663-971b8ce74010-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.098192 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d10fcb0-4d45-45bf-a663-971b8ce74010-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.100775 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xpnw\" (UniqueName: \"kubernetes.io/projected/3d10fcb0-4d45-45bf-a663-971b8ce74010-kube-api-access-2xpnw\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.104278 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.255581 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.281792 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" event={"ID":"b082689f-6a6d-4da0-b2b1-f78343ba1e85","Type":"ContainerStarted","Data":"cf45881769a320d80a0b475b78878572d21663b4dd8fafe7ae1c9681a95c4a07"} Mar 13 12:05:15 crc kubenswrapper[4837]: W0313 12:05:15.764494 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7b01be4_73b6_48eb_a06d_4fb38863d982.slice/crio-c8180a84e0af5653dde0ac3c7b4b0a9aa55749048023693725605b5733ff15c7 WatchSource:0}: Error finding container c8180a84e0af5653dde0ac3c7b4b0a9aa55749048023693725605b5733ff15c7: Status 404 returned error can't find the container with id c8180a84e0af5653dde0ac3c7b4b0a9aa55749048023693725605b5733ff15c7 Mar 13 12:05:15 crc kubenswrapper[4837]: E0313 12:05:15.792618 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 13 12:05:15 crc kubenswrapper[4837]: E0313 12:05:15.792853 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6wlx2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-29x4s_openstack(63688ba3-e68c-4f88-a6e4-6c373b30f929): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:05:15 crc kubenswrapper[4837]: E0313 12:05:15.794151 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-29x4s" podUID="63688ba3-e68c-4f88-a6e4-6c373b30f929" Mar 13 12:05:15 crc kubenswrapper[4837]: E0313 12:05:15.841303 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 13 12:05:15 crc kubenswrapper[4837]: E0313 12:05:15.841499 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d6rdw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePul
lPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-4pw9n_openstack(5b9562f6-0527-40b4-9b2e-f5b2f22aa272): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:05:15 crc kubenswrapper[4837]: E0313 12:05:15.842891 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n" podUID="5b9562f6-0527-40b4-9b2e-f5b2f22aa272" Mar 13 12:05:16 crc kubenswrapper[4837]: I0313 12:05:16.304131 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e7b01be4-73b6-48eb-a06d-4fb38863d982","Type":"ContainerStarted","Data":"c8180a84e0af5653dde0ac3c7b4b0a9aa55749048023693725605b5733ff15c7"} Mar 13 12:05:16 crc kubenswrapper[4837]: I0313 12:05:16.311555 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"75c6e15833f1c4c6d83b741f42f4ce0c9378844641d1d149fd75349d257dfc71"} Mar 13 12:05:16 crc kubenswrapper[4837]: I0313 12:05:16.457724 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 12:05:16 crc kubenswrapper[4837]: I0313 12:05:16.476518 4837 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 12:05:16 crc kubenswrapper[4837]: W0313 12:05:16.484584 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13254c8b_516c_435e_9db2_a8d518434f29.slice/crio-5d7d6eb76793e1d7753bbea8ba4648a937e2549b33bfd032cf29e8f6d1e62f4c WatchSource:0}: Error finding container 5d7d6eb76793e1d7753bbea8ba4648a937e2549b33bfd032cf29e8f6d1e62f4c: Status 404 returned error can't find the container with id 5d7d6eb76793e1d7753bbea8ba4648a937e2549b33bfd032cf29e8f6d1e62f4c Mar 13 12:05:16 crc kubenswrapper[4837]: I0313 12:05:16.885484 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 12:05:16 crc kubenswrapper[4837]: I0313 12:05:16.892779 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n" Mar 13 12:05:16 crc kubenswrapper[4837]: I0313 12:05:16.924443 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-29x4s" Mar 13 12:05:16 crc kubenswrapper[4837]: I0313 12:05:16.932482 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7vs6f"] Mar 13 12:05:16 crc kubenswrapper[4837]: I0313 12:05:16.956524 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 13 12:05:16 crc kubenswrapper[4837]: I0313 12:05:16.980798 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.023864 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-config\") pod \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\" (UID: \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\") " Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.023925 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wlx2\" (UniqueName: \"kubernetes.io/projected/63688ba3-e68c-4f88-a6e4-6c373b30f929-kube-api-access-6wlx2\") pod \"63688ba3-e68c-4f88-a6e4-6c373b30f929\" (UID: \"63688ba3-e68c-4f88-a6e4-6c373b30f929\") " Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.023995 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6rdw\" (UniqueName: \"kubernetes.io/projected/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-kube-api-access-d6rdw\") pod \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\" (UID: \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\") " Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.024019 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-dns-svc\") pod \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\" (UID: \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\") " Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 
12:05:17.024034 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63688ba3-e68c-4f88-a6e4-6c373b30f929-config\") pod \"63688ba3-e68c-4f88-a6e4-6c373b30f929\" (UID: \"63688ba3-e68c-4f88-a6e4-6c373b30f929\") " Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.024836 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63688ba3-e68c-4f88-a6e4-6c373b30f929-config" (OuterVolumeSpecName: "config") pod "63688ba3-e68c-4f88-a6e4-6c373b30f929" (UID: "63688ba3-e68c-4f88-a6e4-6c373b30f929"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.024879 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-config" (OuterVolumeSpecName: "config") pod "5b9562f6-0527-40b4-9b2e-f5b2f22aa272" (UID: "5b9562f6-0527-40b4-9b2e-f5b2f22aa272"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.026152 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5b9562f6-0527-40b4-9b2e-f5b2f22aa272" (UID: "5b9562f6-0527-40b4-9b2e-f5b2f22aa272"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.035983 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63688ba3-e68c-4f88-a6e4-6c373b30f929-kube-api-access-6wlx2" (OuterVolumeSpecName: "kube-api-access-6wlx2") pod "63688ba3-e68c-4f88-a6e4-6c373b30f929" (UID: "63688ba3-e68c-4f88-a6e4-6c373b30f929"). InnerVolumeSpecName "kube-api-access-6wlx2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.047921 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-kube-api-access-d6rdw" (OuterVolumeSpecName: "kube-api-access-d6rdw") pod "5b9562f6-0527-40b4-9b2e-f5b2f22aa272" (UID: "5b9562f6-0527-40b4-9b2e-f5b2f22aa272"). InnerVolumeSpecName "kube-api-access-d6rdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.112229 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-ls998"] Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.118884 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nbhpw"] Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.125628 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wlx2\" (UniqueName: \"kubernetes.io/projected/63688ba3-e68c-4f88-a6e4-6c373b30f929-kube-api-access-6wlx2\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.125690 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6rdw\" (UniqueName: \"kubernetes.io/projected/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-kube-api-access-d6rdw\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.125705 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.125717 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63688ba3-e68c-4f88-a6e4-6c373b30f929-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.125728 4837 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:17 crc kubenswrapper[4837]: W0313 12:05:17.141939 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadb9ab64_aa4b_45f4_8738_0ed74c3ed2bd.slice/crio-f98462c5662e516a739e1682b280767c7d46d013e84a15caaf3ddb8663b73d9a WatchSource:0}: Error finding container f98462c5662e516a739e1682b280767c7d46d013e84a15caaf3ddb8663b73d9a: Status 404 returned error can't find the container with id f98462c5662e516a739e1682b280767c7d46d013e84a15caaf3ddb8663b73d9a Mar 13 12:05:17 crc kubenswrapper[4837]: W0313 12:05:17.145201 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae5deee0_59c4_4fa7_8d8c_e12b516885dc.slice/crio-dac372c91256bbeabdb4ee95a6241431746c1c91b7bb9f40ca6c3bd206fe1f51 WatchSource:0}: Error finding container dac372c91256bbeabdb4ee95a6241431746c1c91b7bb9f40ca6c3bd206fe1f51: Status 404 returned error can't find the container with id dac372c91256bbeabdb4ee95a6241431746c1c91b7bb9f40ca6c3bd206fe1f51 Mar 13 12:05:17 crc kubenswrapper[4837]: W0313 12:05:17.150723 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae39431b_5fa4_4a09_b76f_44b4d256c129.slice/crio-eb9226cf039456e6835939bdf8a4b2eeafcc714a35f2dde06005ef4ba14f3c23 WatchSource:0}: Error finding container eb9226cf039456e6835939bdf8a4b2eeafcc714a35f2dde06005ef4ba14f3c23: Status 404 returned error can't find the container with id eb9226cf039456e6835939bdf8a4b2eeafcc714a35f2dde06005ef4ba14f3c23 Mar 13 12:05:17 crc kubenswrapper[4837]: W0313 12:05:17.155212 4837 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32dc51d9_5638_4530_91c8_5be8c13e60f3.slice/crio-0789b60c660687f217e3da05e0adc970132c74550fab068133b05224809ee33f WatchSource:0}: Error finding container 0789b60c660687f217e3da05e0adc970132c74550fab068133b05224809ee33f: Status 404 returned error can't find the container with id 0789b60c660687f217e3da05e0adc970132c74550fab068133b05224809ee33f Mar 13 12:05:17 crc kubenswrapper[4837]: W0313 12:05:17.155729 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71e00962_6b2f_495c_8f34_52993f66cef9.slice/crio-5b6e7b77ac7881a09bdbef9dacc95d17e67aaa52c4bc3122591a1f9bfca019c3 WatchSource:0}: Error finding container 5b6e7b77ac7881a09bdbef9dacc95d17e67aaa52c4bc3122591a1f9bfca019c3: Status 404 returned error can't find the container with id 5b6e7b77ac7881a09bdbef9dacc95d17e67aaa52c4bc3122591a1f9bfca019c3 Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.319887 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ae39431b-5fa4-4a09-b76f-44b4d256c129","Type":"ContainerStarted","Data":"eb9226cf039456e6835939bdf8a4b2eeafcc714a35f2dde06005ef4ba14f3c23"} Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.321488 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"13254c8b-516c-435e-9db2-a8d518434f29","Type":"ContainerStarted","Data":"5d7d6eb76793e1d7753bbea8ba4648a937e2549b33bfd032cf29e8f6d1e62f4c"} Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.323482 4837 generic.go:334] "Generic (PLEG): container finished" podID="b082689f-6a6d-4da0-b2b1-f78343ba1e85" containerID="31ac9d4659c535248d95613eed33e384ae2a9e10c0405e3e50bc5460cd46aeea" exitCode=0 Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.323541 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" 
event={"ID":"b082689f-6a6d-4da0-b2b1-f78343ba1e85","Type":"ContainerDied","Data":"31ac9d4659c535248d95613eed33e384ae2a9e10c0405e3e50bc5460cd46aeea"} Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.325368 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n" event={"ID":"5b9562f6-0527-40b4-9b2e-f5b2f22aa272","Type":"ContainerDied","Data":"da674bf6aef47158bcb9f95c5eb9d1a420c65f8f1989031a5ce339d17724e353"} Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.325666 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n" Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.329797 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"362e31d4-ea62-40ed-8426-982d47559472","Type":"ContainerStarted","Data":"ddc538520d8e600b4e67cdf449278e1a15b98855f0bcdb3070e43bd4632a4dc3"} Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.339338 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd","Type":"ContainerStarted","Data":"f98462c5662e516a739e1682b280767c7d46d013e84a15caaf3ddb8663b73d9a"} Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.368235 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" event={"ID":"ae5deee0-59c4-4fa7-8d8c-e12b516885dc","Type":"ContainerStarted","Data":"dac372c91256bbeabdb4ee95a6241431746c1c91b7bb9f40ca6c3bd206fe1f51"} Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.372807 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-29x4s" event={"ID":"63688ba3-e68c-4f88-a6e4-6c373b30f929","Type":"ContainerDied","Data":"a94e60840947634f6bf1cb267e8fd71afc4b1db7581a3cc51184314c3d6b19e7"} Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.372955 4837 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-29x4s" Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.381337 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ls998" event={"ID":"71e00962-6b2f-495c-8f34-52993f66cef9","Type":"ContainerStarted","Data":"5b6e7b77ac7881a09bdbef9dacc95d17e67aaa52c4bc3122591a1f9bfca019c3"} Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.420990 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a250849d-ca15-40fa-8b1d-a32b5abc6861","Type":"ContainerStarted","Data":"7a22f32b80bf3ec02fab7028c9c981153ef89481c11b18583b8c1e3f0c67df24"} Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.423376 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nbhpw" event={"ID":"32dc51d9-5638-4530-91c8-5be8c13e60f3","Type":"ContainerStarted","Data":"0789b60c660687f217e3da05e0adc970132c74550fab068133b05224809ee33f"} Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.431020 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4pw9n"] Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.454272 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4pw9n"] Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.475132 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-29x4s"] Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.483492 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-29x4s"] Mar 13 12:05:18 crc kubenswrapper[4837]: I0313 12:05:18.093584 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 12:05:18 crc kubenswrapper[4837]: I0313 12:05:18.231017 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 12:05:18 crc 
kubenswrapper[4837]: W0313 12:05:18.360144 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d10fcb0_4d45_45bf_a663_971b8ce74010.slice/crio-8195b06a2da393e3a7948f6a45bf28cc58d86c3113ec50ede9d3d585d004a6f6 WatchSource:0}: Error finding container 8195b06a2da393e3a7948f6a45bf28cc58d86c3113ec50ede9d3d585d004a6f6: Status 404 returned error can't find the container with id 8195b06a2da393e3a7948f6a45bf28cc58d86c3113ec50ede9d3d585d004a6f6 Mar 13 12:05:18 crc kubenswrapper[4837]: W0313 12:05:18.364768 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38d61ffe_3c44_4657_bc91_d849f766a3e1.slice/crio-b7ce67bc082f03585d388dd5a305fcdc2954f5f487cbe3ce865785cd6c8555a4 WatchSource:0}: Error finding container b7ce67bc082f03585d388dd5a305fcdc2954f5f487cbe3ce865785cd6c8555a4: Status 404 returned error can't find the container with id b7ce67bc082f03585d388dd5a305fcdc2954f5f487cbe3ce865785cd6c8555a4 Mar 13 12:05:18 crc kubenswrapper[4837]: I0313 12:05:18.433555 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3d10fcb0-4d45-45bf-a663-971b8ce74010","Type":"ContainerStarted","Data":"8195b06a2da393e3a7948f6a45bf28cc58d86c3113ec50ede9d3d585d004a6f6"} Mar 13 12:05:18 crc kubenswrapper[4837]: I0313 12:05:18.436815 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"38d61ffe-3c44-4657-bc91-d849f766a3e1","Type":"ContainerStarted","Data":"b7ce67bc082f03585d388dd5a305fcdc2954f5f487cbe3ce865785cd6c8555a4"} Mar 13 12:05:19 crc kubenswrapper[4837]: I0313 12:05:19.060030 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b9562f6-0527-40b4-9b2e-f5b2f22aa272" path="/var/lib/kubelet/pods/5b9562f6-0527-40b4-9b2e-f5b2f22aa272/volumes" Mar 13 12:05:19 crc kubenswrapper[4837]: I0313 12:05:19.060567 4837 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63688ba3-e68c-4f88-a6e4-6c373b30f929" path="/var/lib/kubelet/pods/63688ba3-e68c-4f88-a6e4-6c373b30f929/volumes" Mar 13 12:05:25 crc kubenswrapper[4837]: I0313 12:05:25.502228 4837 generic.go:334] "Generic (PLEG): container finished" podID="ae5deee0-59c4-4fa7-8d8c-e12b516885dc" containerID="86bd795421dbb8f853056d0ba23fbd4446b0d4a512f91ff6b5cf999a58e9698a" exitCode=0 Mar 13 12:05:25 crc kubenswrapper[4837]: I0313 12:05:25.502432 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" event={"ID":"ae5deee0-59c4-4fa7-8d8c-e12b516885dc","Type":"ContainerDied","Data":"86bd795421dbb8f853056d0ba23fbd4446b0d4a512f91ff6b5cf999a58e9698a"} Mar 13 12:05:26 crc kubenswrapper[4837]: I0313 12:05:26.511675 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"38d61ffe-3c44-4657-bc91-d849f766a3e1","Type":"ContainerStarted","Data":"9c66450db75ad1df997cb59ba25629075bbdda7b2d722c10e976e13d82921a53"} Mar 13 12:05:26 crc kubenswrapper[4837]: I0313 12:05:26.513181 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"362e31d4-ea62-40ed-8426-982d47559472","Type":"ContainerStarted","Data":"e6271d6852050c2d1ad25293179089e8f2036d9ca9c515d25f1b4682afdad63a"} Mar 13 12:05:26 crc kubenswrapper[4837]: I0313 12:05:26.515530 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" event={"ID":"b082689f-6a6d-4da0-b2b1-f78343ba1e85","Type":"ContainerStarted","Data":"5938df2cd4038d22580ec15dba2d8841cc84e40dbbc2c06c0a6f8ca76641fd6c"} Mar 13 12:05:26 crc kubenswrapper[4837]: I0313 12:05:26.515684 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" Mar 13 12:05:26 crc kubenswrapper[4837]: I0313 12:05:26.517243 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ovs-ls998" event={"ID":"71e00962-6b2f-495c-8f34-52993f66cef9","Type":"ContainerStarted","Data":"12a87353cee06d3c720268e4190ee48375a699885842e4a67424170d2dca396e"} Mar 13 12:05:26 crc kubenswrapper[4837]: I0313 12:05:26.558459 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" podStartSLOduration=24.063680955 podStartE2EDuration="25.558435223s" podCreationTimestamp="2026-03-13 12:05:01 +0000 UTC" firstStartedPulling="2026-03-13 12:05:14.539071641 +0000 UTC m=+1030.177338404" lastFinishedPulling="2026-03-13 12:05:16.033825909 +0000 UTC m=+1031.672092672" observedRunningTime="2026-03-13 12:05:26.557310777 +0000 UTC m=+1042.195577540" watchObservedRunningTime="2026-03-13 12:05:26.558435223 +0000 UTC m=+1042.196701996" Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.541257 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e7b01be4-73b6-48eb-a06d-4fb38863d982","Type":"ContainerStarted","Data":"afe3a88a0e8205fefe122a8099e4acc29a3ebc22c1a9a9cfe3c00f5ab1794007"} Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.545309 4837 generic.go:334] "Generic (PLEG): container finished" podID="71e00962-6b2f-495c-8f34-52993f66cef9" containerID="12a87353cee06d3c720268e4190ee48375a699885842e4a67424170d2dca396e" exitCode=0 Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.545409 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ls998" event={"ID":"71e00962-6b2f-495c-8f34-52993f66cef9","Type":"ContainerDied","Data":"12a87353cee06d3c720268e4190ee48375a699885842e4a67424170d2dca396e"} Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.554338 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd","Type":"ContainerStarted","Data":"42d9d96b2ed8394546063531c8e1be585dbcac59412f38099bec42e73ac4a269"} Mar 
13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.563075 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" event={"ID":"ae5deee0-59c4-4fa7-8d8c-e12b516885dc","Type":"ContainerStarted","Data":"5550e9e7c07715ced202f75aa48fd7f0efbb60ae462c21120aa95405a645e06f"} Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.563244 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.574516 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"13254c8b-516c-435e-9db2-a8d518434f29","Type":"ContainerStarted","Data":"d7e3a2439b933c4a76e0a0472aaff3cf352a36e55d6c5a3aa674478c5299b9be"} Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.580482 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3d10fcb0-4d45-45bf-a663-971b8ce74010","Type":"ContainerStarted","Data":"f6f3d45ae4a8f0eb2588879a86f07d1cae21417054813611a749565840b2152d"} Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.593540 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nbhpw" event={"ID":"32dc51d9-5638-4530-91c8-5be8c13e60f3","Type":"ContainerStarted","Data":"fcd880f044b42eae49579b77a948d25c9288848e485da833c17099adb9827f62"} Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.595137 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.597220 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" podStartSLOduration=26.597180556 podStartE2EDuration="26.597180556s" podCreationTimestamp="2026-03-13 12:05:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-13 12:05:27.589830033 +0000 UTC m=+1043.228096796" watchObservedRunningTime="2026-03-13 12:05:27.597180556 +0000 UTC m=+1043.235447329" Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.601486 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a250849d-ca15-40fa-8b1d-a32b5abc6861","Type":"ContainerStarted","Data":"07fc1a83feb8d7932c2b80f34ffbd6218ef230bb996e92d9892feae57b23c402"} Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.601978 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.610884 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ae39431b-5fa4-4a09-b76f-44b4d256c129","Type":"ContainerStarted","Data":"57ca39395feb458a8f10d2261da889c29c26f61eed4082b3099aea11d6719d00"} Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.611942 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.674681 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=14.865654186 podStartE2EDuration="22.674626719s" podCreationTimestamp="2026-03-13 12:05:05 +0000 UTC" firstStartedPulling="2026-03-13 12:05:17.155047825 +0000 UTC m=+1032.793314588" lastFinishedPulling="2026-03-13 12:05:24.964020368 +0000 UTC m=+1040.602287121" observedRunningTime="2026-03-13 12:05:27.669680272 +0000 UTC m=+1043.307947035" watchObservedRunningTime="2026-03-13 12:05:27.674626719 +0000 UTC m=+1043.312893482" Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.688870 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.419684323 podStartE2EDuration="19.688844119s" podCreationTimestamp="2026-03-13 12:05:08 +0000 
UTC" firstStartedPulling="2026-03-13 12:05:16.471252385 +0000 UTC m=+1032.109519148" lastFinishedPulling="2026-03-13 12:05:25.740412181 +0000 UTC m=+1041.378678944" observedRunningTime="2026-03-13 12:05:27.686988021 +0000 UTC m=+1043.325254784" watchObservedRunningTime="2026-03-13 12:05:27.688844119 +0000 UTC m=+1043.327110882" Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.722382 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-nbhpw" podStartSLOduration=8.816655873 podStartE2EDuration="16.72234785s" podCreationTimestamp="2026-03-13 12:05:11 +0000 UTC" firstStartedPulling="2026-03-13 12:05:17.159817716 +0000 UTC m=+1032.798084479" lastFinishedPulling="2026-03-13 12:05:25.065509693 +0000 UTC m=+1040.703776456" observedRunningTime="2026-03-13 12:05:27.714284025 +0000 UTC m=+1043.352550788" watchObservedRunningTime="2026-03-13 12:05:27.72234785 +0000 UTC m=+1043.360614613" Mar 13 12:05:28 crc kubenswrapper[4837]: I0313 12:05:28.628207 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ls998" event={"ID":"71e00962-6b2f-495c-8f34-52993f66cef9","Type":"ContainerStarted","Data":"04a1193282a85494c2ab05d91c89fc7c4037180e86d5b306a180cc69d1011c14"} Mar 13 12:05:28 crc kubenswrapper[4837]: I0313 12:05:28.628827 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ls998" event={"ID":"71e00962-6b2f-495c-8f34-52993f66cef9","Type":"ContainerStarted","Data":"8ad679f31ac7d1ea967741bb7fc2b12c07ed73d803e94bc78bc90f1333ccac41"} Mar 13 12:05:29 crc kubenswrapper[4837]: I0313 12:05:29.635421 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:29 crc kubenswrapper[4837]: I0313 12:05:29.635473 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:30 crc kubenswrapper[4837]: I0313 12:05:30.656727 4837 generic.go:334] 
"Generic (PLEG): container finished" podID="adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd" containerID="42d9d96b2ed8394546063531c8e1be585dbcac59412f38099bec42e73ac4a269" exitCode=0
Mar 13 12:05:30 crc kubenswrapper[4837]: I0313 12:05:30.656831 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd","Type":"ContainerDied","Data":"42d9d96b2ed8394546063531c8e1be585dbcac59412f38099bec42e73ac4a269"}
Mar 13 12:05:30 crc kubenswrapper[4837]: I0313 12:05:30.662317 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"38d61ffe-3c44-4657-bc91-d849f766a3e1","Type":"ContainerStarted","Data":"79e9c120dab3e4846bad3812fbe3a0b8fd9ee6861488dbae0fbc00248f43dc50"}
Mar 13 12:05:30 crc kubenswrapper[4837]: I0313 12:05:30.664609 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3d10fcb0-4d45-45bf-a663-971b8ce74010","Type":"ContainerStarted","Data":"9f39dd85c04bee002aa20f6ebdae34460e35ba660b37d65ea48aa4af4d70b080"}
Mar 13 12:05:30 crc kubenswrapper[4837]: I0313 12:05:30.666954 4837 generic.go:334] "Generic (PLEG): container finished" podID="362e31d4-ea62-40ed-8426-982d47559472" containerID="e6271d6852050c2d1ad25293179089e8f2036d9ca9c515d25f1b4682afdad63a" exitCode=0
Mar 13 12:05:30 crc kubenswrapper[4837]: I0313 12:05:30.667598 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"362e31d4-ea62-40ed-8426-982d47559472","Type":"ContainerDied","Data":"e6271d6852050c2d1ad25293179089e8f2036d9ca9c515d25f1b4682afdad63a"}
Mar 13 12:05:30 crc kubenswrapper[4837]: I0313 12:05:30.683222 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-ls998" podStartSLOduration=12.016478999 podStartE2EDuration="19.683204907s" podCreationTimestamp="2026-03-13 12:05:11 +0000 UTC" firstStartedPulling="2026-03-13 12:05:17.159926819 +0000 UTC m=+1032.798193602" lastFinishedPulling="2026-03-13 12:05:24.826652747 +0000 UTC m=+1040.464919510" observedRunningTime="2026-03-13 12:05:28.657315886 +0000 UTC m=+1044.295582649" watchObservedRunningTime="2026-03-13 12:05:30.683204907 +0000 UTC m=+1046.321471670"
Mar 13 12:05:30 crc kubenswrapper[4837]: I0313 12:05:30.713753 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=5.900969137 podStartE2EDuration="17.713725724s" podCreationTimestamp="2026-03-13 12:05:13 +0000 UTC" firstStartedPulling="2026-03-13 12:05:18.362623155 +0000 UTC m=+1034.000889918" lastFinishedPulling="2026-03-13 12:05:30.175379742 +0000 UTC m=+1045.813646505" observedRunningTime="2026-03-13 12:05:30.708462408 +0000 UTC m=+1046.346729181" watchObservedRunningTime="2026-03-13 12:05:30.713725724 +0000 UTC m=+1046.351992487"
Mar 13 12:05:30 crc kubenswrapper[4837]: I0313 12:05:30.741923 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=5.952283422 podStartE2EDuration="17.741895136s" podCreationTimestamp="2026-03-13 12:05:13 +0000 UTC" firstStartedPulling="2026-03-13 12:05:18.367990405 +0000 UTC m=+1034.006257168" lastFinishedPulling="2026-03-13 12:05:30.157602119 +0000 UTC m=+1045.795868882" observedRunningTime="2026-03-13 12:05:30.730106714 +0000 UTC m=+1046.368373477" watchObservedRunningTime="2026-03-13 12:05:30.741895136 +0000 UTC m=+1046.380161899"
Mar 13 12:05:31 crc kubenswrapper[4837]: I0313 12:05:31.239290 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Mar 13 12:05:31 crc kubenswrapper[4837]: I0313 12:05:31.675675 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd","Type":"ContainerStarted","Data":"4da61201dc7ec1c5870826529a37d07172ef4f8646e96fa3dc5baf6b06eeb75f"}
Mar 13 12:05:31 crc kubenswrapper[4837]: I0313 12:05:31.677395 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"362e31d4-ea62-40ed-8426-982d47559472","Type":"ContainerStarted","Data":"7833797c9eba5aca08144d6c0ccc75cf7c6d31ad43b74cf435cc15ebd56e332f"}
Mar 13 12:05:31 crc kubenswrapper[4837]: I0313 12:05:31.697532 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=20.878127273 podStartE2EDuration="28.697511217s" podCreationTimestamp="2026-03-13 12:05:03 +0000 UTC" firstStartedPulling="2026-03-13 12:05:17.14513735 +0000 UTC m=+1032.783404113" lastFinishedPulling="2026-03-13 12:05:24.964521294 +0000 UTC m=+1040.602788057" observedRunningTime="2026-03-13 12:05:31.693915592 +0000 UTC m=+1047.332182375" watchObservedRunningTime="2026-03-13 12:05:31.697511217 +0000 UTC m=+1047.335777980"
Mar 13 12:05:31 crc kubenswrapper[4837]: I0313 12:05:31.713876 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=19.025298679 podStartE2EDuration="27.713856744s" podCreationTimestamp="2026-03-13 12:05:04 +0000 UTC" firstStartedPulling="2026-03-13 12:05:16.925361859 +0000 UTC m=+1032.563628622" lastFinishedPulling="2026-03-13 12:05:25.613919924 +0000 UTC m=+1041.252186687" observedRunningTime="2026-03-13 12:05:31.712150881 +0000 UTC m=+1047.350417644" watchObservedRunningTime="2026-03-13 12:05:31.713856744 +0000 UTC m=+1047.352123507"
Mar 13 12:05:32 crc kubenswrapper[4837]: I0313 12:05:32.013820 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7"
Mar 13 12:05:32 crc kubenswrapper[4837]: I0313 12:05:32.292589 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f"
Mar 13 12:05:32 crc kubenswrapper[4837]: I0313 12:05:32.349417 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-7wbz7"]
Mar 13 12:05:32 crc kubenswrapper[4837]: I0313 12:05:32.684344 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" podUID="b082689f-6a6d-4da0-b2b1-f78343ba1e85" containerName="dnsmasq-dns" containerID="cri-o://5938df2cd4038d22580ec15dba2d8841cc84e40dbbc2c06c0a6f8ca76641fd6c" gracePeriod=10
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.021530 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.063520 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.068779 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7"
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.120063 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b082689f-6a6d-4da0-b2b1-f78343ba1e85-dns-svc\") pod \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\" (UID: \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\") "
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.120140 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b082689f-6a6d-4da0-b2b1-f78343ba1e85-config\") pod \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\" (UID: \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\") "
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.120851 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcjgv\" (UniqueName: \"kubernetes.io/projected/b082689f-6a6d-4da0-b2b1-f78343ba1e85-kube-api-access-bcjgv\") pod \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\" (UID: \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\") "
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.165557 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b082689f-6a6d-4da0-b2b1-f78343ba1e85-kube-api-access-bcjgv" (OuterVolumeSpecName: "kube-api-access-bcjgv") pod "b082689f-6a6d-4da0-b2b1-f78343ba1e85" (UID: "b082689f-6a6d-4da0-b2b1-f78343ba1e85"). InnerVolumeSpecName "kube-api-access-bcjgv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.204304 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b082689f-6a6d-4da0-b2b1-f78343ba1e85-config" (OuterVolumeSpecName: "config") pod "b082689f-6a6d-4da0-b2b1-f78343ba1e85" (UID: "b082689f-6a6d-4da0-b2b1-f78343ba1e85"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.215426 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b082689f-6a6d-4da0-b2b1-f78343ba1e85-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b082689f-6a6d-4da0-b2b1-f78343ba1e85" (UID: "b082689f-6a6d-4da0-b2b1-f78343ba1e85"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.256265 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.263101 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b082689f-6a6d-4da0-b2b1-f78343ba1e85-config\") on node \"crc\" DevicePath \"\""
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.263141 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcjgv\" (UniqueName: \"kubernetes.io/projected/b082689f-6a6d-4da0-b2b1-f78343ba1e85-kube-api-access-bcjgv\") on node \"crc\" DevicePath \"\""
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.263157 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b082689f-6a6d-4da0-b2b1-f78343ba1e85-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.290052 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.692336 4837 generic.go:334] "Generic (PLEG): container finished" podID="b082689f-6a6d-4da0-b2b1-f78343ba1e85" containerID="5938df2cd4038d22580ec15dba2d8841cc84e40dbbc2c06c0a6f8ca76641fd6c" exitCode=0
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.692420 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7"
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.693340 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" event={"ID":"b082689f-6a6d-4da0-b2b1-f78343ba1e85","Type":"ContainerDied","Data":"5938df2cd4038d22580ec15dba2d8841cc84e40dbbc2c06c0a6f8ca76641fd6c"}
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.693470 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" event={"ID":"b082689f-6a6d-4da0-b2b1-f78343ba1e85","Type":"ContainerDied","Data":"cf45881769a320d80a0b475b78878572d21663b4dd8fafe7ae1c9681a95c4a07"}
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.693585 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.693704 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.693668 4837 scope.go:117] "RemoveContainer" containerID="5938df2cd4038d22580ec15dba2d8841cc84e40dbbc2c06c0a6f8ca76641fd6c"
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.724529 4837 scope.go:117] "RemoveContainer" containerID="31ac9d4659c535248d95613eed33e384ae2a9e10c0405e3e50bc5460cd46aeea"
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.736371 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-7wbz7"]
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.743990 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-7wbz7"]
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.744190 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.744436 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.757394 4837 scope.go:117] "RemoveContainer" containerID="5938df2cd4038d22580ec15dba2d8841cc84e40dbbc2c06c0a6f8ca76641fd6c"
Mar 13 12:05:33 crc kubenswrapper[4837]: E0313 12:05:33.760115 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5938df2cd4038d22580ec15dba2d8841cc84e40dbbc2c06c0a6f8ca76641fd6c\": container with ID starting with 5938df2cd4038d22580ec15dba2d8841cc84e40dbbc2c06c0a6f8ca76641fd6c not found: ID does not exist" containerID="5938df2cd4038d22580ec15dba2d8841cc84e40dbbc2c06c0a6f8ca76641fd6c"
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.760161 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5938df2cd4038d22580ec15dba2d8841cc84e40dbbc2c06c0a6f8ca76641fd6c"} err="failed to get container status \"5938df2cd4038d22580ec15dba2d8841cc84e40dbbc2c06c0a6f8ca76641fd6c\": rpc error: code = NotFound desc = could not find container \"5938df2cd4038d22580ec15dba2d8841cc84e40dbbc2c06c0a6f8ca76641fd6c\": container with ID starting with 5938df2cd4038d22580ec15dba2d8841cc84e40dbbc2c06c0a6f8ca76641fd6c not found: ID does not exist"
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.760188 4837 scope.go:117] "RemoveContainer" containerID="31ac9d4659c535248d95613eed33e384ae2a9e10c0405e3e50bc5460cd46aeea"
Mar 13 12:05:33 crc kubenswrapper[4837]: E0313 12:05:33.760496 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31ac9d4659c535248d95613eed33e384ae2a9e10c0405e3e50bc5460cd46aeea\": container with ID starting with 31ac9d4659c535248d95613eed33e384ae2a9e10c0405e3e50bc5460cd46aeea not found: ID does not exist" containerID="31ac9d4659c535248d95613eed33e384ae2a9e10c0405e3e50bc5460cd46aeea"
Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.760533 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31ac9d4659c535248d95613eed33e384ae2a9e10c0405e3e50bc5460cd46aeea"} err="failed to get container status \"31ac9d4659c535248d95613eed33e384ae2a9e10c0405e3e50bc5460cd46aeea\": rpc error: code = NotFound desc = could not find container \"31ac9d4659c535248d95613eed33e384ae2a9e10c0405e3e50bc5460cd46aeea\": container with ID starting with 31ac9d4659c535248d95613eed33e384ae2a9e10c0405e3e50bc5460cd46aeea not found: ID does not exist"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.005936 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-c7zsw"]
Mar 13 12:05:34 crc kubenswrapper[4837]: E0313 12:05:34.006326 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b082689f-6a6d-4da0-b2b1-f78343ba1e85" containerName="dnsmasq-dns"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.006346 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b082689f-6a6d-4da0-b2b1-f78343ba1e85" containerName="dnsmasq-dns"
Mar 13 12:05:34 crc kubenswrapper[4837]: E0313 12:05:34.006377 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b082689f-6a6d-4da0-b2b1-f78343ba1e85" containerName="init"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.006384 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b082689f-6a6d-4da0-b2b1-f78343ba1e85" containerName="init"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.006528 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="b082689f-6a6d-4da0-b2b1-f78343ba1e85" containerName="dnsmasq-dns"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.007380 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.013320 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.023895 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-c7zsw"]
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.073427 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tblrl\" (UniqueName: \"kubernetes.io/projected/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-kube-api-access-tblrl\") pod \"dnsmasq-dns-7fd796d7df-c7zsw\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.073494 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-config\") pod \"dnsmasq-dns-7fd796d7df-c7zsw\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.073775 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-c7zsw\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.074044 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-c7zsw\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.175628 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-c7zsw\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.175747 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-c7zsw\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.175790 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tblrl\" (UniqueName: \"kubernetes.io/projected/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-kube-api-access-tblrl\") pod \"dnsmasq-dns-7fd796d7df-c7zsw\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.175827 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-config\") pod \"dnsmasq-dns-7fd796d7df-c7zsw\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.176563 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-c7zsw\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.176584 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-c7zsw\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.176829 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-config\") pod \"dnsmasq-dns-7fd796d7df-c7zsw\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.178673 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-w69p6"]
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.179616 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-w69p6"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.183203 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.221463 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tblrl\" (UniqueName: \"kubernetes.io/projected/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-kube-api-access-tblrl\") pod \"dnsmasq-dns-7fd796d7df-c7zsw\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.252205 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-w69p6"]
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.281253 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5g7q\" (UniqueName: \"kubernetes.io/projected/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-kube-api-access-m5g7q\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.281302 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-combined-ca-bundle\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.281349 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-ovn-rundir\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.281371 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-config\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.281403 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-ovs-rundir\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.281443 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.283428 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-c7zsw"]
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.289040 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.321025 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-f6w8l"]
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.322404 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.326235 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.353098 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-f6w8l"]
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.382758 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-combined-ca-bundle\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.382838 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s72rx\" (UniqueName: \"kubernetes.io/projected/177d4af4-1f81-43ff-bcbc-b3d74689452f-kube-api-access-s72rx\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.382886 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-ovn-rundir\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.382912 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-config\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.382950 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-ovs-rundir\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.383007 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-config\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.383033 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.383066 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.383110 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.383167 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.383192 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5g7q\" (UniqueName: \"kubernetes.io/projected/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-kube-api-access-m5g7q\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.384980 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-ovn-rundir\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.385721 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-config\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.385798 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-ovs-rundir\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.393621 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-combined-ca-bundle\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.409228 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.444428 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.445024 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5g7q\" (UniqueName: \"kubernetes.io/projected/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-kube-api-access-m5g7q\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.446210 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.456042 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.464135 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.464357 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.464378 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.464469 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-wbggf"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.484394 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-scripts\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.484435 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.484470 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.484512 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.484543 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-config\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.484571 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s72rx\" (UniqueName: \"kubernetes.io/projected/177d4af4-1f81-43ff-bcbc-b3d74689452f-kube-api-access-s72rx\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.484597 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.484625 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtvhq\" (UniqueName: \"kubernetes.io/projected/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-kube-api-access-vtvhq\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.484727 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-config\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.484750 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.484766 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.484784 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0"
Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.486099 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\"
(UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.486693 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.488346 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-config\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.488997 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.497190 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-w69p6" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.521469 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s72rx\" (UniqueName: \"kubernetes.io/projected/177d4af4-1f81-43ff-bcbc-b3d74689452f-kube-api-access-s72rx\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.587076 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.587224 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.587248 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-scripts\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.587295 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.588702 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-config\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.590229 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-config\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.590550 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-scripts\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.592284 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.592440 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.592839 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 
12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.592938 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtvhq\" (UniqueName: \"kubernetes.io/projected/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-kube-api-access-vtvhq\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.593165 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.595686 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.618700 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtvhq\" (UniqueName: \"kubernetes.io/projected/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-kube-api-access-vtvhq\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.745868 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.747881 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.818329 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.854224 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.926876 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-c7zsw"] Mar 13 12:05:35 crc kubenswrapper[4837]: I0313 12:05:35.066512 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b082689f-6a6d-4da0-b2b1-f78343ba1e85" path="/var/lib/kubelet/pods/b082689f-6a6d-4da0-b2b1-f78343ba1e85/volumes" Mar 13 12:05:35 crc kubenswrapper[4837]: I0313 12:05:35.067432 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-w69p6"] Mar 13 12:05:35 crc kubenswrapper[4837]: W0313 12:05:35.288317 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod177d4af4_1f81_43ff_bcbc_b3d74689452f.slice/crio-e5c07a9f609638a177aae5955938d17ec2d0fa96ca0f604f6cfe39e8b5c432a3 WatchSource:0}: Error finding container e5c07a9f609638a177aae5955938d17ec2d0fa96ca0f604f6cfe39e8b5c432a3: Status 404 returned error can't find the container with id e5c07a9f609638a177aae5955938d17ec2d0fa96ca0f604f6cfe39e8b5c432a3 Mar 13 12:05:35 crc kubenswrapper[4837]: I0313 12:05:35.290306 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-f6w8l"] Mar 13 12:05:35 crc kubenswrapper[4837]: I0313 12:05:35.405258 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 13 12:05:35 crc kubenswrapper[4837]: W0313 12:05:35.411772 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25ea0f5e_e277_4944_8c9d_2c7709e1a8cf.slice/crio-4008525c8372527fc7d833af378cd520d521880ba97e58c049fd34a0eb17fd54 WatchSource:0}: 
Error finding container 4008525c8372527fc7d833af378cd520d521880ba97e58c049fd34a0eb17fd54: Status 404 returned error can't find the container with id 4008525c8372527fc7d833af378cd520d521880ba97e58c049fd34a0eb17fd54 Mar 13 12:05:35 crc kubenswrapper[4837]: I0313 12:05:35.708913 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" event={"ID":"177d4af4-1f81-43ff-bcbc-b3d74689452f","Type":"ContainerStarted","Data":"e5c07a9f609638a177aae5955938d17ec2d0fa96ca0f604f6cfe39e8b5c432a3"} Mar 13 12:05:35 crc kubenswrapper[4837]: I0313 12:05:35.710072 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf","Type":"ContainerStarted","Data":"4008525c8372527fc7d833af378cd520d521880ba97e58c049fd34a0eb17fd54"} Mar 13 12:05:35 crc kubenswrapper[4837]: I0313 12:05:35.711184 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-w69p6" event={"ID":"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d","Type":"ContainerStarted","Data":"0f115ab9b4ae23f3688aa37ceef85bf3548b247962900d3d66b9e2f0e067e2d5"} Mar 13 12:05:35 crc kubenswrapper[4837]: I0313 12:05:35.712438 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw" event={"ID":"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5","Type":"ContainerStarted","Data":"be59ccb857947065a413e020e34f05fc612d0f53e32fd7765216d90d410b005f"} Mar 13 12:05:36 crc kubenswrapper[4837]: I0313 12:05:36.187170 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:36 crc kubenswrapper[4837]: I0313 12:05:36.187848 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:36 crc kubenswrapper[4837]: I0313 12:05:36.741758 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 13 
12:05:36 crc kubenswrapper[4837]: I0313 12:05:36.820979 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.442242 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-f6w8l"] Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.442958 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.479028 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-gqrt7"] Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.482832 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.496924 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gqrt7"] Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.559425 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq8p7\" (UniqueName: \"kubernetes.io/projected/de68f8fe-0650-4ef4-9445-d31e119de423-kube-api-access-dq8p7\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.559486 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-config\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.559522 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.559594 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.559764 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-dns-svc\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.661382 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-dns-svc\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.661891 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq8p7\" (UniqueName: \"kubernetes.io/projected/de68f8fe-0650-4ef4-9445-d31e119de423-kube-api-access-dq8p7\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.661941 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-config\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.661977 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.662030 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.662189 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-dns-svc\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.662817 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-config\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.663117 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-ovsdbserver-nb\") pod 
\"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.663630 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.682075 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq8p7\" (UniqueName: \"kubernetes.io/projected/de68f8fe-0650-4ef4-9445-d31e119de423-kube-api-access-dq8p7\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.736388 4837 generic.go:334] "Generic (PLEG): container finished" podID="3a34d203-ee9d-4cb5-b2a4-64f57334e1e5" containerID="008473fe0498516be934d5e47eeaa2a9839d4bbed34d1703c646c45767593f85" exitCode=0 Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.736442 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw" event={"ID":"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5","Type":"ContainerDied","Data":"008473fe0498516be934d5e47eeaa2a9839d4bbed34d1703c646c45767593f85"} Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.739190 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" event={"ID":"177d4af4-1f81-43ff-bcbc-b3d74689452f","Type":"ContainerStarted","Data":"f9c06e52cd86e1b4bf6834ffdc5c8a0fcbac08a96f3424206d1b66cf562bc09a"} Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.740431 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-w69p6" 
event={"ID":"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d","Type":"ContainerStarted","Data":"e69192ea0ed98994a829f21ad52593d7568b0b12212fa0fc89516fd2ecf81eff"} Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.761068 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-w69p6" podStartSLOduration=4.76104943 podStartE2EDuration="4.76104943s" podCreationTimestamp="2026-03-13 12:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:05:38.756617949 +0000 UTC m=+1054.394884712" watchObservedRunningTime="2026-03-13 12:05:38.76104943 +0000 UTC m=+1054.399316193" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.814260 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.276316 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gqrt7"] Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.583864 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.594543 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.597878 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.598120 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.598250 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-lccz5" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.598250 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.613689 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.678821 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.678903 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmlmq\" (UniqueName: \"kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-kube-api-access-wmlmq\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.678928 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" 
Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.679072 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/59565710-b9bc-46e6-ad92-7f12376de17c-cache\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.679283 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59565710-b9bc-46e6-ad92-7f12376de17c-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.679372 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/59565710-b9bc-46e6-ad92-7f12376de17c-lock\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.749691 4837 generic.go:334] "Generic (PLEG): container finished" podID="177d4af4-1f81-43ff-bcbc-b3d74689452f" containerID="f9c06e52cd86e1b4bf6834ffdc5c8a0fcbac08a96f3424206d1b66cf562bc09a" exitCode=0 Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.749842 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" event={"ID":"177d4af4-1f81-43ff-bcbc-b3d74689452f","Type":"ContainerDied","Data":"f9c06e52cd86e1b4bf6834ffdc5c8a0fcbac08a96f3424206d1b66cf562bc09a"} Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.752923 4837 generic.go:334] "Generic (PLEG): container finished" podID="de68f8fe-0650-4ef4-9445-d31e119de423" containerID="e03f96aaa50d1c9241f9e2fad6e8df257f1e78642de37d3f89872036b5b55220" exitCode=0 Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.754183 
4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gqrt7" event={"ID":"de68f8fe-0650-4ef4-9445-d31e119de423","Type":"ContainerDied","Data":"e03f96aaa50d1c9241f9e2fad6e8df257f1e78642de37d3f89872036b5b55220"}
Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.754239 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gqrt7" event={"ID":"de68f8fe-0650-4ef4-9445-d31e119de423","Type":"ContainerStarted","Data":"d23a17995d98b2790b62117dc60f3874a46893982c985ce77e930333e0f2f46d"}
Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.784925 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0"
Mar 13 12:05:39 crc kubenswrapper[4837]: E0313 12:05:39.785062 4837 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 13 12:05:39 crc kubenswrapper[4837]: E0313 12:05:39.785074 4837 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 13 12:05:39 crc kubenswrapper[4837]: E0313 12:05:39.785112 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift podName:59565710-b9bc-46e6-ad92-7f12376de17c nodeName:}" failed. No retries permitted until 2026-03-13 12:05:40.285097497 +0000 UTC m=+1055.923364260 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift") pod "swift-storage-0" (UID: "59565710-b9bc-46e6-ad92-7f12376de17c") : configmap "swift-ring-files" not found
Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.785300 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmlmq\" (UniqueName: \"kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-kube-api-access-wmlmq\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0"
Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.785337 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0"
Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.785384 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/59565710-b9bc-46e6-ad92-7f12376de17c-cache\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0"
Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.785437 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59565710-b9bc-46e6-ad92-7f12376de17c-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0"
Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.785466 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/59565710-b9bc-46e6-ad92-7f12376de17c-lock\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0"
Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.786093 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/swift-storage-0"
Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.786269 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/59565710-b9bc-46e6-ad92-7f12376de17c-cache\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0"
Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.787612 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/59565710-b9bc-46e6-ad92-7f12376de17c-lock\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0"
Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.795881 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59565710-b9bc-46e6-ad92-7f12376de17c-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0"
Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.839500 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmlmq\" (UniqueName: \"kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-kube-api-access-wmlmq\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0"
Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.849876 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0"
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.177337 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l"
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.262289 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw"
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.302691 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-config\") pod \"177d4af4-1f81-43ff-bcbc-b3d74689452f\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") "
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.302766 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-ovsdbserver-sb\") pod \"177d4af4-1f81-43ff-bcbc-b3d74689452f\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") "
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.302806 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-dns-svc\") pod \"177d4af4-1f81-43ff-bcbc-b3d74689452f\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") "
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.302921 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s72rx\" (UniqueName: \"kubernetes.io/projected/177d4af4-1f81-43ff-bcbc-b3d74689452f-kube-api-access-s72rx\") pod \"177d4af4-1f81-43ff-bcbc-b3d74689452f\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") "
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.302973 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-ovsdbserver-nb\") pod \"177d4af4-1f81-43ff-bcbc-b3d74689452f\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") "
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.303249 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0"
Mar 13 12:05:40 crc kubenswrapper[4837]: E0313 12:05:40.303430 4837 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 13 12:05:40 crc kubenswrapper[4837]: E0313 12:05:40.303466 4837 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 13 12:05:40 crc kubenswrapper[4837]: E0313 12:05:40.303534 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift podName:59565710-b9bc-46e6-ad92-7f12376de17c nodeName:}" failed. No retries permitted until 2026-03-13 12:05:41.303508048 +0000 UTC m=+1056.941774811 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift") pod "swift-storage-0" (UID: "59565710-b9bc-46e6-ad92-7f12376de17c") : configmap "swift-ring-files" not found
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.307546 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/177d4af4-1f81-43ff-bcbc-b3d74689452f-kube-api-access-s72rx" (OuterVolumeSpecName: "kube-api-access-s72rx") pod "177d4af4-1f81-43ff-bcbc-b3d74689452f" (UID: "177d4af4-1f81-43ff-bcbc-b3d74689452f"). InnerVolumeSpecName "kube-api-access-s72rx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.323047 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "177d4af4-1f81-43ff-bcbc-b3d74689452f" (UID: "177d4af4-1f81-43ff-bcbc-b3d74689452f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.323653 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "177d4af4-1f81-43ff-bcbc-b3d74689452f" (UID: "177d4af4-1f81-43ff-bcbc-b3d74689452f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.325220 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-config" (OuterVolumeSpecName: "config") pod "177d4af4-1f81-43ff-bcbc-b3d74689452f" (UID: "177d4af4-1f81-43ff-bcbc-b3d74689452f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.326745 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "177d4af4-1f81-43ff-bcbc-b3d74689452f" (UID: "177d4af4-1f81-43ff-bcbc-b3d74689452f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.404166 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-dns-svc\") pod \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") "
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.404509 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tblrl\" (UniqueName: \"kubernetes.io/projected/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-kube-api-access-tblrl\") pod \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") "
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.404573 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-config\") pod \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") "
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.404631 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-ovsdbserver-nb\") pod \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") "
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.405004 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.405020 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-config\") on node \"crc\" DevicePath \"\""
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.405029 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.405037 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.405046 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s72rx\" (UniqueName: \"kubernetes.io/projected/177d4af4-1f81-43ff-bcbc-b3d74689452f-kube-api-access-s72rx\") on node \"crc\" DevicePath \"\""
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.407226 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-kube-api-access-tblrl" (OuterVolumeSpecName: "kube-api-access-tblrl") pod "3a34d203-ee9d-4cb5-b2a4-64f57334e1e5" (UID: "3a34d203-ee9d-4cb5-b2a4-64f57334e1e5"). InnerVolumeSpecName "kube-api-access-tblrl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.420082 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-config" (OuterVolumeSpecName: "config") pod "3a34d203-ee9d-4cb5-b2a4-64f57334e1e5" (UID: "3a34d203-ee9d-4cb5-b2a4-64f57334e1e5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.422069 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3a34d203-ee9d-4cb5-b2a4-64f57334e1e5" (UID: "3a34d203-ee9d-4cb5-b2a4-64f57334e1e5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.422874 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3a34d203-ee9d-4cb5-b2a4-64f57334e1e5" (UID: "3a34d203-ee9d-4cb5-b2a4-64f57334e1e5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.506787 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tblrl\" (UniqueName: \"kubernetes.io/projected/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-kube-api-access-tblrl\") on node \"crc\" DevicePath \"\""
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.506848 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-config\") on node \"crc\" DevicePath \"\""
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.506865 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.506878 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.763052 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf","Type":"ContainerStarted","Data":"60b408b07a440275036d25737bc6cd3f4e00346e306c3b99863ad3c2d758fb25"}
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.763100 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf","Type":"ContainerStarted","Data":"ab64c3706e783fd7321b2f1fb1e02d3f494cfa6379c035375e2ab370a6d3a514"}
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.763805 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.766665 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gqrt7" event={"ID":"de68f8fe-0650-4ef4-9445-d31e119de423","Type":"ContainerStarted","Data":"a3d9d75be9f89d9ac614473e4e3a4f535965320bd55937576eb6b69f6cb8f8b9"}
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.766808 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-gqrt7"
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.772497 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw"
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.772668 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw" event={"ID":"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5","Type":"ContainerDied","Data":"be59ccb857947065a413e020e34f05fc612d0f53e32fd7765216d90d410b005f"}
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.772734 4837 scope.go:117] "RemoveContainer" containerID="008473fe0498516be934d5e47eeaa2a9839d4bbed34d1703c646c45767593f85"
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.775005 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" event={"ID":"177d4af4-1f81-43ff-bcbc-b3d74689452f","Type":"ContainerDied","Data":"e5c07a9f609638a177aae5955938d17ec2d0fa96ca0f604f6cfe39e8b5c432a3"}
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.775249 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l"
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.797086 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.481699651 podStartE2EDuration="6.797068333s" podCreationTimestamp="2026-03-13 12:05:34 +0000 UTC" firstStartedPulling="2026-03-13 12:05:35.416188369 +0000 UTC m=+1051.054455132" lastFinishedPulling="2026-03-13 12:05:39.731557051 +0000 UTC m=+1055.369823814" observedRunningTime="2026-03-13 12:05:40.793976035 +0000 UTC m=+1056.432242808" watchObservedRunningTime="2026-03-13 12:05:40.797068333 +0000 UTC m=+1056.435335096"
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.816889 4837 scope.go:117] "RemoveContainer" containerID="f9c06e52cd86e1b4bf6834ffdc5c8a0fcbac08a96f3424206d1b66cf562bc09a"
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.818072 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-gqrt7" podStartSLOduration=2.817815409 podStartE2EDuration="2.817815409s" podCreationTimestamp="2026-03-13 12:05:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:05:40.812238053 +0000 UTC m=+1056.450504816" watchObservedRunningTime="2026-03-13 12:05:40.817815409 +0000 UTC m=+1056.456082172"
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.852091 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.911207 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-c7zsw"]
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.922191 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-c7zsw"]
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.968448 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-f6w8l"]
Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.980817 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-f6w8l"]
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.017530 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.057807 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="177d4af4-1f81-43ff-bcbc-b3d74689452f" path="/var/lib/kubelet/pods/177d4af4-1f81-43ff-bcbc-b3d74689452f/volumes"
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.058488 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a34d203-ee9d-4cb5-b2a4-64f57334e1e5" path="/var/lib/kubelet/pods/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5/volumes"
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.319195 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0"
Mar 13 12:05:41 crc kubenswrapper[4837]: E0313 12:05:41.319375 4837 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 13 12:05:41 crc kubenswrapper[4837]: E0313 12:05:41.319577 4837 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 13 12:05:41 crc kubenswrapper[4837]: E0313 12:05:41.319706 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift podName:59565710-b9bc-46e6-ad92-7f12376de17c nodeName:}" failed. No retries permitted until 2026-03-13 12:05:43.319618134 +0000 UTC m=+1058.957884897 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift") pod "swift-storage-0" (UID: "59565710-b9bc-46e6-ad92-7f12376de17c") : configmap "swift-ring-files" not found
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.495504 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-a48b-account-create-update-ckblt"]
Mar 13 12:05:41 crc kubenswrapper[4837]: E0313 12:05:41.495899 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a34d203-ee9d-4cb5-b2a4-64f57334e1e5" containerName="init"
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.495916 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a34d203-ee9d-4cb5-b2a4-64f57334e1e5" containerName="init"
Mar 13 12:05:41 crc kubenswrapper[4837]: E0313 12:05:41.495975 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="177d4af4-1f81-43ff-bcbc-b3d74689452f" containerName="init"
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.495984 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="177d4af4-1f81-43ff-bcbc-b3d74689452f" containerName="init"
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.496149 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="177d4af4-1f81-43ff-bcbc-b3d74689452f" containerName="init"
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.496165 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a34d203-ee9d-4cb5-b2a4-64f57334e1e5" containerName="init"
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.496699 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a48b-account-create-update-ckblt"
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.507456 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a48b-account-create-update-ckblt"]
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.509511 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.535415 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-n42jz"]
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.536760 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-n42jz"
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.548920 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-n42jz"]
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.623692 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n7fz\" (UniqueName: \"kubernetes.io/projected/f2936dcb-f1fa-446b-b20f-87e09a9c03ee-kube-api-access-6n7fz\") pod \"glance-a48b-account-create-update-ckblt\" (UID: \"f2936dcb-f1fa-446b-b20f-87e09a9c03ee\") " pod="openstack/glance-a48b-account-create-update-ckblt"
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.623750 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2936dcb-f1fa-446b-b20f-87e09a9c03ee-operator-scripts\") pod \"glance-a48b-account-create-update-ckblt\" (UID: \"f2936dcb-f1fa-446b-b20f-87e09a9c03ee\") " pod="openstack/glance-a48b-account-create-update-ckblt"
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.623886 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28320b08-9dde-491d-b151-21f93395bf10-operator-scripts\") pod \"glance-db-create-n42jz\" (UID: \"28320b08-9dde-491d-b151-21f93395bf10\") " pod="openstack/glance-db-create-n42jz"
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.623926 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px6d4\" (UniqueName: \"kubernetes.io/projected/28320b08-9dde-491d-b151-21f93395bf10-kube-api-access-px6d4\") pod \"glance-db-create-n42jz\" (UID: \"28320b08-9dde-491d-b151-21f93395bf10\") " pod="openstack/glance-db-create-n42jz"
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.725173 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28320b08-9dde-491d-b151-21f93395bf10-operator-scripts\") pod \"glance-db-create-n42jz\" (UID: \"28320b08-9dde-491d-b151-21f93395bf10\") " pod="openstack/glance-db-create-n42jz"
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.725247 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px6d4\" (UniqueName: \"kubernetes.io/projected/28320b08-9dde-491d-b151-21f93395bf10-kube-api-access-px6d4\") pod \"glance-db-create-n42jz\" (UID: \"28320b08-9dde-491d-b151-21f93395bf10\") " pod="openstack/glance-db-create-n42jz"
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.725319 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n7fz\" (UniqueName: \"kubernetes.io/projected/f2936dcb-f1fa-446b-b20f-87e09a9c03ee-kube-api-access-6n7fz\") pod \"glance-a48b-account-create-update-ckblt\" (UID: \"f2936dcb-f1fa-446b-b20f-87e09a9c03ee\") " pod="openstack/glance-a48b-account-create-update-ckblt"
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.725375 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2936dcb-f1fa-446b-b20f-87e09a9c03ee-operator-scripts\") pod \"glance-a48b-account-create-update-ckblt\" (UID: \"f2936dcb-f1fa-446b-b20f-87e09a9c03ee\") " pod="openstack/glance-a48b-account-create-update-ckblt"
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.725838 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28320b08-9dde-491d-b151-21f93395bf10-operator-scripts\") pod \"glance-db-create-n42jz\" (UID: \"28320b08-9dde-491d-b151-21f93395bf10\") " pod="openstack/glance-db-create-n42jz"
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.726747 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2936dcb-f1fa-446b-b20f-87e09a9c03ee-operator-scripts\") pod \"glance-a48b-account-create-update-ckblt\" (UID: \"f2936dcb-f1fa-446b-b20f-87e09a9c03ee\") " pod="openstack/glance-a48b-account-create-update-ckblt"
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.744021 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px6d4\" (UniqueName: \"kubernetes.io/projected/28320b08-9dde-491d-b151-21f93395bf10-kube-api-access-px6d4\") pod \"glance-db-create-n42jz\" (UID: \"28320b08-9dde-491d-b151-21f93395bf10\") " pod="openstack/glance-db-create-n42jz"
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.749215 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n7fz\" (UniqueName: \"kubernetes.io/projected/f2936dcb-f1fa-446b-b20f-87e09a9c03ee-kube-api-access-6n7fz\") pod \"glance-a48b-account-create-update-ckblt\" (UID: \"f2936dcb-f1fa-446b-b20f-87e09a9c03ee\") " pod="openstack/glance-a48b-account-create-update-ckblt"
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.812100 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a48b-account-create-update-ckblt"
Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.858295 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-n42jz"
Mar 13 12:05:42 crc kubenswrapper[4837]: I0313 12:05:42.266323 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a48b-account-create-update-ckblt"]
Mar 13 12:05:42 crc kubenswrapper[4837]: W0313 12:05:42.270298 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2936dcb_f1fa_446b_b20f_87e09a9c03ee.slice/crio-930fa6fd842bef07ab99729d63d9c467c183629e38c0c3b6e35eacef428e4c49 WatchSource:0}: Error finding container 930fa6fd842bef07ab99729d63d9c467c183629e38c0c3b6e35eacef428e4c49: Status 404 returned error can't find the container with id 930fa6fd842bef07ab99729d63d9c467c183629e38c0c3b6e35eacef428e4c49
Mar 13 12:05:42 crc kubenswrapper[4837]: W0313 12:05:42.389310 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28320b08_9dde_491d_b151_21f93395bf10.slice/crio-79f305c056052302aa6aedb9477a994fc2b391f952b1de999fb2cdfcefd3e773 WatchSource:0}: Error finding container 79f305c056052302aa6aedb9477a994fc2b391f952b1de999fb2cdfcefd3e773: Status 404 returned error can't find the container with id 79f305c056052302aa6aedb9477a994fc2b391f952b1de999fb2cdfcefd3e773
Mar 13 12:05:42 crc kubenswrapper[4837]: I0313 12:05:42.393309 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-n42jz"]
Mar 13 12:05:42 crc kubenswrapper[4837]: I0313 12:05:42.795199 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-n42jz" event={"ID":"28320b08-9dde-491d-b151-21f93395bf10","Type":"ContainerStarted","Data":"9c444d34c403a2440618afe6e0c75ef9551c465f012f8ba4f50c5bde9744bb16"}
Mar 13 12:05:42 crc kubenswrapper[4837]: I0313 12:05:42.795252 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-n42jz" event={"ID":"28320b08-9dde-491d-b151-21f93395bf10","Type":"ContainerStarted","Data":"79f305c056052302aa6aedb9477a994fc2b391f952b1de999fb2cdfcefd3e773"}
Mar 13 12:05:42 crc kubenswrapper[4837]: I0313 12:05:42.796912 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a48b-account-create-update-ckblt" event={"ID":"f2936dcb-f1fa-446b-b20f-87e09a9c03ee","Type":"ContainerStarted","Data":"4f6fb24113c34cb08d7bf34817309c7c27eeab0cdaee4f12683e138394d254b1"}
Mar 13 12:05:42 crc kubenswrapper[4837]: I0313 12:05:42.796958 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a48b-account-create-update-ckblt" event={"ID":"f2936dcb-f1fa-446b-b20f-87e09a9c03ee","Type":"ContainerStarted","Data":"930fa6fd842bef07ab99729d63d9c467c183629e38c0c3b6e35eacef428e4c49"}
Mar 13 12:05:42 crc kubenswrapper[4837]: I0313 12:05:42.817726 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-a48b-account-create-update-ckblt" podStartSLOduration=1.817707908 podStartE2EDuration="1.817707908s" podCreationTimestamp="2026-03-13 12:05:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:05:42.816373735 +0000 UTC m=+1058.454640498" watchObservedRunningTime="2026-03-13 12:05:42.817707908 +0000 UTC m=+1058.455974681"
Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.371945 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0"
Mar 13 12:05:43 crc kubenswrapper[4837]: E0313 12:05:43.372264 4837 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 13 12:05:43 crc kubenswrapper[4837]: E0313 12:05:43.372321 4837 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 13 12:05:43 crc kubenswrapper[4837]: E0313 12:05:43.372410 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift podName:59565710-b9bc-46e6-ad92-7f12376de17c nodeName:}" failed. No retries permitted until 2026-03-13 12:05:47.372382558 +0000 UTC m=+1063.010649311 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift") pod "swift-storage-0" (UID: "59565710-b9bc-46e6-ad92-7f12376de17c") : configmap "swift-ring-files" not found
Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.389751 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-tgg8d"]
Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.391158 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tgg8d"
Mar 13 12:05:43 crc kubenswrapper[4837]: W0313 12:05:43.393520 4837 reflector.go:561] object-"openstack"/"openstack-mariadb-root-db-secret": failed to list *v1.Secret: secrets "openstack-mariadb-root-db-secret" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object
Mar 13 12:05:43 crc kubenswrapper[4837]: E0313 12:05:43.393591 4837 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"openstack-mariadb-root-db-secret\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openstack-mariadb-root-db-secret\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.409314 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-tgg8d"]
Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.473731 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw8n4\" (UniqueName: \"kubernetes.io/projected/b2c8bb14-f7ed-4a97-a6f8-73f67824897e-kube-api-access-vw8n4\") pod \"root-account-create-update-tgg8d\" (UID: \"b2c8bb14-f7ed-4a97-a6f8-73f67824897e\") " pod="openstack/root-account-create-update-tgg8d"
Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.474118 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2c8bb14-f7ed-4a97-a6f8-73f67824897e-operator-scripts\") pod \"root-account-create-update-tgg8d\" (UID: \"b2c8bb14-f7ed-4a97-a6f8-73f67824897e\") " pod="openstack/root-account-create-update-tgg8d"
Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.576240 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2c8bb14-f7ed-4a97-a6f8-73f67824897e-operator-scripts\") pod \"root-account-create-update-tgg8d\" (UID: \"b2c8bb14-f7ed-4a97-a6f8-73f67824897e\") " pod="openstack/root-account-create-update-tgg8d"
Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.576354 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw8n4\" (UniqueName: \"kubernetes.io/projected/b2c8bb14-f7ed-4a97-a6f8-73f67824897e-kube-api-access-vw8n4\") pod \"root-account-create-update-tgg8d\" (UID: \"b2c8bb14-f7ed-4a97-a6f8-73f67824897e\") " pod="openstack/root-account-create-update-tgg8d"
Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.577085 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2c8bb14-f7ed-4a97-a6f8-73f67824897e-operator-scripts\") pod \"root-account-create-update-tgg8d\" (UID: \"b2c8bb14-f7ed-4a97-a6f8-73f67824897e\") " pod="openstack/root-account-create-update-tgg8d"
Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.588172 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-69xgx"]
Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.589185 4837 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.591281 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.591292 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.594145 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.603360 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw8n4\" (UniqueName: \"kubernetes.io/projected/b2c8bb14-f7ed-4a97-a6f8-73f67824897e-kube-api-access-vw8n4\") pod \"root-account-create-update-tgg8d\" (UID: \"b2c8bb14-f7ed-4a97-a6f8-73f67824897e\") " pod="openstack/root-account-create-update-tgg8d" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.619242 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-69xgx"] Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.678290 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcxh5\" (UniqueName: \"kubernetes.io/projected/24998567-afa6-4adc-a503-4fc054946aef-kube-api-access-pcxh5\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.678349 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/24998567-afa6-4adc-a503-4fc054946aef-etc-swift\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 
12:05:43.678447 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/24998567-afa6-4adc-a503-4fc054946aef-ring-data-devices\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.678521 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-combined-ca-bundle\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.678571 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-dispersionconf\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.678629 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24998567-afa6-4adc-a503-4fc054946aef-scripts\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.678709 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-swiftconf\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: 
I0313 12:05:43.708401 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tgg8d" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.780455 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcxh5\" (UniqueName: \"kubernetes.io/projected/24998567-afa6-4adc-a503-4fc054946aef-kube-api-access-pcxh5\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.780533 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/24998567-afa6-4adc-a503-4fc054946aef-etc-swift\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.780674 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/24998567-afa6-4adc-a503-4fc054946aef-ring-data-devices\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.780709 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-combined-ca-bundle\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.780757 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-dispersionconf\") pod \"swift-ring-rebalance-69xgx\" (UID: 
\"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.780791 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24998567-afa6-4adc-a503-4fc054946aef-scripts\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.780825 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-swiftconf\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.781970 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/24998567-afa6-4adc-a503-4fc054946aef-etc-swift\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.782551 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24998567-afa6-4adc-a503-4fc054946aef-scripts\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.782935 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/24998567-afa6-4adc-a503-4fc054946aef-ring-data-devices\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 
12:05:43.784598 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-combined-ca-bundle\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.784689 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-swiftconf\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.785241 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-dispersionconf\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.803318 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcxh5\" (UniqueName: \"kubernetes.io/projected/24998567-afa6-4adc-a503-4fc054946aef-kube-api-access-pcxh5\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.818231 4837 generic.go:334] "Generic (PLEG): container finished" podID="28320b08-9dde-491d-b151-21f93395bf10" containerID="9c444d34c403a2440618afe6e0c75ef9551c465f012f8ba4f50c5bde9744bb16" exitCode=0 Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.819198 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-n42jz" 
event={"ID":"28320b08-9dde-491d-b151-21f93395bf10","Type":"ContainerDied","Data":"9c444d34c403a2440618afe6e0c75ef9551c465f012f8ba4f50c5bde9744bb16"} Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.952932 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:44 crc kubenswrapper[4837]: I0313 12:05:44.155808 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-tgg8d"] Mar 13 12:05:44 crc kubenswrapper[4837]: W0313 12:05:44.158869 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2c8bb14_f7ed_4a97_a6f8_73f67824897e.slice/crio-8d935243c5f433e3064f23569f68df8366157f2b6f6ece0f72a4fdf38d92f739 WatchSource:0}: Error finding container 8d935243c5f433e3064f23569f68df8366157f2b6f6ece0f72a4fdf38d92f739: Status 404 returned error can't find the container with id 8d935243c5f433e3064f23569f68df8366157f2b6f6ece0f72a4fdf38d92f739 Mar 13 12:05:44 crc kubenswrapper[4837]: I0313 12:05:44.370411 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-69xgx"] Mar 13 12:05:44 crc kubenswrapper[4837]: W0313 12:05:44.380229 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24998567_afa6_4adc_a503_4fc054946aef.slice/crio-1ed0808ad30fc86e0f532b5d60900d3ad34dad8095b22ae074e18418eb961f7b WatchSource:0}: Error finding container 1ed0808ad30fc86e0f532b5d60900d3ad34dad8095b22ae074e18418eb961f7b: Status 404 returned error can't find the container with id 1ed0808ad30fc86e0f532b5d60900d3ad34dad8095b22ae074e18418eb961f7b Mar 13 12:05:44 crc kubenswrapper[4837]: I0313 12:05:44.426491 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 13 12:05:44 crc kubenswrapper[4837]: I0313 12:05:44.827276 4837 
generic.go:334] "Generic (PLEG): container finished" podID="b2c8bb14-f7ed-4a97-a6f8-73f67824897e" containerID="57b8ae831c66c62748afbdcfeed21457125293d241eef5e2c9e04fa2bc86f046" exitCode=0 Mar 13 12:05:44 crc kubenswrapper[4837]: I0313 12:05:44.827357 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tgg8d" event={"ID":"b2c8bb14-f7ed-4a97-a6f8-73f67824897e","Type":"ContainerDied","Data":"57b8ae831c66c62748afbdcfeed21457125293d241eef5e2c9e04fa2bc86f046"} Mar 13 12:05:44 crc kubenswrapper[4837]: I0313 12:05:44.827604 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tgg8d" event={"ID":"b2c8bb14-f7ed-4a97-a6f8-73f67824897e","Type":"ContainerStarted","Data":"8d935243c5f433e3064f23569f68df8366157f2b6f6ece0f72a4fdf38d92f739"} Mar 13 12:05:44 crc kubenswrapper[4837]: I0313 12:05:44.829867 4837 generic.go:334] "Generic (PLEG): container finished" podID="f2936dcb-f1fa-446b-b20f-87e09a9c03ee" containerID="4f6fb24113c34cb08d7bf34817309c7c27eeab0cdaee4f12683e138394d254b1" exitCode=0 Mar 13 12:05:44 crc kubenswrapper[4837]: I0313 12:05:44.830012 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a48b-account-create-update-ckblt" event={"ID":"f2936dcb-f1fa-446b-b20f-87e09a9c03ee","Type":"ContainerDied","Data":"4f6fb24113c34cb08d7bf34817309c7c27eeab0cdaee4f12683e138394d254b1"} Mar 13 12:05:44 crc kubenswrapper[4837]: I0313 12:05:44.831460 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-69xgx" event={"ID":"24998567-afa6-4adc-a503-4fc054946aef","Type":"ContainerStarted","Data":"1ed0808ad30fc86e0f532b5d60900d3ad34dad8095b22ae074e18418eb961f7b"} Mar 13 12:05:45 crc kubenswrapper[4837]: I0313 12:05:45.213200 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-n42jz" Mar 13 12:05:45 crc kubenswrapper[4837]: I0313 12:05:45.328144 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28320b08-9dde-491d-b151-21f93395bf10-operator-scripts\") pod \"28320b08-9dde-491d-b151-21f93395bf10\" (UID: \"28320b08-9dde-491d-b151-21f93395bf10\") " Mar 13 12:05:45 crc kubenswrapper[4837]: I0313 12:05:45.328480 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px6d4\" (UniqueName: \"kubernetes.io/projected/28320b08-9dde-491d-b151-21f93395bf10-kube-api-access-px6d4\") pod \"28320b08-9dde-491d-b151-21f93395bf10\" (UID: \"28320b08-9dde-491d-b151-21f93395bf10\") " Mar 13 12:05:45 crc kubenswrapper[4837]: I0313 12:05:45.328905 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28320b08-9dde-491d-b151-21f93395bf10-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "28320b08-9dde-491d-b151-21f93395bf10" (UID: "28320b08-9dde-491d-b151-21f93395bf10"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:45 crc kubenswrapper[4837]: I0313 12:05:45.329332 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28320b08-9dde-491d-b151-21f93395bf10-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:45 crc kubenswrapper[4837]: I0313 12:05:45.335310 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28320b08-9dde-491d-b151-21f93395bf10-kube-api-access-px6d4" (OuterVolumeSpecName: "kube-api-access-px6d4") pod "28320b08-9dde-491d-b151-21f93395bf10" (UID: "28320b08-9dde-491d-b151-21f93395bf10"). InnerVolumeSpecName "kube-api-access-px6d4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:45 crc kubenswrapper[4837]: I0313 12:05:45.431835 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px6d4\" (UniqueName: \"kubernetes.io/projected/28320b08-9dde-491d-b151-21f93395bf10-kube-api-access-px6d4\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:45 crc kubenswrapper[4837]: I0313 12:05:45.838718 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-n42jz" Mar 13 12:05:45 crc kubenswrapper[4837]: I0313 12:05:45.840477 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-n42jz" event={"ID":"28320b08-9dde-491d-b151-21f93395bf10","Type":"ContainerDied","Data":"79f305c056052302aa6aedb9477a994fc2b391f952b1de999fb2cdfcefd3e773"} Mar 13 12:05:45 crc kubenswrapper[4837]: I0313 12:05:45.840513 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79f305c056052302aa6aedb9477a994fc2b391f952b1de999fb2cdfcefd3e773" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.729921 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a48b-account-create-update-ckblt" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.739525 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-tgg8d" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.849274 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tgg8d" event={"ID":"b2c8bb14-f7ed-4a97-a6f8-73f67824897e","Type":"ContainerDied","Data":"8d935243c5f433e3064f23569f68df8366157f2b6f6ece0f72a4fdf38d92f739"} Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.849313 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d935243c5f433e3064f23569f68df8366157f2b6f6ece0f72a4fdf38d92f739" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.849362 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tgg8d" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.851748 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a48b-account-create-update-ckblt" event={"ID":"f2936dcb-f1fa-446b-b20f-87e09a9c03ee","Type":"ContainerDied","Data":"930fa6fd842bef07ab99729d63d9c467c183629e38c0c3b6e35eacef428e4c49"} Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.851777 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="930fa6fd842bef07ab99729d63d9c467c183629e38c0c3b6e35eacef428e4c49" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.852104 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-a48b-account-create-update-ckblt" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.867460 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n7fz\" (UniqueName: \"kubernetes.io/projected/f2936dcb-f1fa-446b-b20f-87e09a9c03ee-kube-api-access-6n7fz\") pod \"f2936dcb-f1fa-446b-b20f-87e09a9c03ee\" (UID: \"f2936dcb-f1fa-446b-b20f-87e09a9c03ee\") " Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.867780 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw8n4\" (UniqueName: \"kubernetes.io/projected/b2c8bb14-f7ed-4a97-a6f8-73f67824897e-kube-api-access-vw8n4\") pod \"b2c8bb14-f7ed-4a97-a6f8-73f67824897e\" (UID: \"b2c8bb14-f7ed-4a97-a6f8-73f67824897e\") " Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.867879 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2936dcb-f1fa-446b-b20f-87e09a9c03ee-operator-scripts\") pod \"f2936dcb-f1fa-446b-b20f-87e09a9c03ee\" (UID: \"f2936dcb-f1fa-446b-b20f-87e09a9c03ee\") " Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.867940 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2c8bb14-f7ed-4a97-a6f8-73f67824897e-operator-scripts\") pod \"b2c8bb14-f7ed-4a97-a6f8-73f67824897e\" (UID: \"b2c8bb14-f7ed-4a97-a6f8-73f67824897e\") " Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.869069 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2936dcb-f1fa-446b-b20f-87e09a9c03ee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f2936dcb-f1fa-446b-b20f-87e09a9c03ee" (UID: "f2936dcb-f1fa-446b-b20f-87e09a9c03ee"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.869130 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2c8bb14-f7ed-4a97-a6f8-73f67824897e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b2c8bb14-f7ed-4a97-a6f8-73f67824897e" (UID: "b2c8bb14-f7ed-4a97-a6f8-73f67824897e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.874686 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2936dcb-f1fa-446b-b20f-87e09a9c03ee-kube-api-access-6n7fz" (OuterVolumeSpecName: "kube-api-access-6n7fz") pod "f2936dcb-f1fa-446b-b20f-87e09a9c03ee" (UID: "f2936dcb-f1fa-446b-b20f-87e09a9c03ee"). InnerVolumeSpecName "kube-api-access-6n7fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.876790 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2c8bb14-f7ed-4a97-a6f8-73f67824897e-kube-api-access-vw8n4" (OuterVolumeSpecName: "kube-api-access-vw8n4") pod "b2c8bb14-f7ed-4a97-a6f8-73f67824897e" (UID: "b2c8bb14-f7ed-4a97-a6f8-73f67824897e"). InnerVolumeSpecName "kube-api-access-vw8n4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.969664 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2936dcb-f1fa-446b-b20f-87e09a9c03ee-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.969707 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2c8bb14-f7ed-4a97-a6f8-73f67824897e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.969721 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n7fz\" (UniqueName: \"kubernetes.io/projected/f2936dcb-f1fa-446b-b20f-87e09a9c03ee-kube-api-access-6n7fz\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.969736 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw8n4\" (UniqueName: \"kubernetes.io/projected/b2c8bb14-f7ed-4a97-a6f8-73f67824897e-kube-api-access-vw8n4\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.305971 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-gmczg"] Mar 13 12:05:47 crc kubenswrapper[4837]: E0313 12:05:47.306322 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2936dcb-f1fa-446b-b20f-87e09a9c03ee" containerName="mariadb-account-create-update" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.306338 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2936dcb-f1fa-446b-b20f-87e09a9c03ee" containerName="mariadb-account-create-update" Mar 13 12:05:47 crc kubenswrapper[4837]: E0313 12:05:47.306378 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2c8bb14-f7ed-4a97-a6f8-73f67824897e" containerName="mariadb-account-create-update" Mar 13 12:05:47 crc 
kubenswrapper[4837]: I0313 12:05:47.306391 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2c8bb14-f7ed-4a97-a6f8-73f67824897e" containerName="mariadb-account-create-update" Mar 13 12:05:47 crc kubenswrapper[4837]: E0313 12:05:47.306409 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28320b08-9dde-491d-b151-21f93395bf10" containerName="mariadb-database-create" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.306417 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="28320b08-9dde-491d-b151-21f93395bf10" containerName="mariadb-database-create" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.306618 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2c8bb14-f7ed-4a97-a6f8-73f67824897e" containerName="mariadb-account-create-update" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.306658 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="28320b08-9dde-491d-b151-21f93395bf10" containerName="mariadb-database-create" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.306672 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2936dcb-f1fa-446b-b20f-87e09a9c03ee" containerName="mariadb-account-create-update" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.307191 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-gmczg" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.327701 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-gmczg"] Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.376576 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j7s8\" (UniqueName: \"kubernetes.io/projected/5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e-kube-api-access-9j7s8\") pod \"keystone-db-create-gmczg\" (UID: \"5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e\") " pod="openstack/keystone-db-create-gmczg" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.377251 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e-operator-scripts\") pod \"keystone-db-create-gmczg\" (UID: \"5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e\") " pod="openstack/keystone-db-create-gmczg" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.377490 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:47 crc kubenswrapper[4837]: E0313 12:05:47.377932 4837 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 12:05:47 crc kubenswrapper[4837]: E0313 12:05:47.377953 4837 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 12:05:47 crc kubenswrapper[4837]: E0313 12:05:47.378047 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift 
podName:59565710-b9bc-46e6-ad92-7f12376de17c nodeName:}" failed. No retries permitted until 2026-03-13 12:05:55.378027569 +0000 UTC m=+1071.016294332 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift") pod "swift-storage-0" (UID: "59565710-b9bc-46e6-ad92-7f12376de17c") : configmap "swift-ring-files" not found Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.411834 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-d970-account-create-update-lkc7z"] Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.413457 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d970-account-create-update-lkc7z" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.416079 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.423329 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d970-account-create-update-lkc7z"] Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.479958 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fwgn\" (UniqueName: \"kubernetes.io/projected/2230cdcb-087e-4882-8aea-c5d850b711ac-kube-api-access-9fwgn\") pod \"keystone-d970-account-create-update-lkc7z\" (UID: \"2230cdcb-087e-4882-8aea-c5d850b711ac\") " pod="openstack/keystone-d970-account-create-update-lkc7z" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.480043 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e-operator-scripts\") pod \"keystone-db-create-gmczg\" (UID: \"5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e\") " pod="openstack/keystone-db-create-gmczg" Mar 13 12:05:47 crc 
kubenswrapper[4837]: I0313 12:05:47.480135 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2230cdcb-087e-4882-8aea-c5d850b711ac-operator-scripts\") pod \"keystone-d970-account-create-update-lkc7z\" (UID: \"2230cdcb-087e-4882-8aea-c5d850b711ac\") " pod="openstack/keystone-d970-account-create-update-lkc7z" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.480229 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j7s8\" (UniqueName: \"kubernetes.io/projected/5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e-kube-api-access-9j7s8\") pod \"keystone-db-create-gmczg\" (UID: \"5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e\") " pod="openstack/keystone-db-create-gmczg" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.490592 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e-operator-scripts\") pod \"keystone-db-create-gmczg\" (UID: \"5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e\") " pod="openstack/keystone-db-create-gmczg" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.516407 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j7s8\" (UniqueName: \"kubernetes.io/projected/5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e-kube-api-access-9j7s8\") pod \"keystone-db-create-gmczg\" (UID: \"5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e\") " pod="openstack/keystone-db-create-gmczg" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.539070 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-rb248"] Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.540528 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rb248" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.587333 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2230cdcb-087e-4882-8aea-c5d850b711ac-operator-scripts\") pod \"keystone-d970-account-create-update-lkc7z\" (UID: \"2230cdcb-087e-4882-8aea-c5d850b711ac\") " pod="openstack/keystone-d970-account-create-update-lkc7z" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.587873 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fwgn\" (UniqueName: \"kubernetes.io/projected/2230cdcb-087e-4882-8aea-c5d850b711ac-kube-api-access-9fwgn\") pod \"keystone-d970-account-create-update-lkc7z\" (UID: \"2230cdcb-087e-4882-8aea-c5d850b711ac\") " pod="openstack/keystone-d970-account-create-update-lkc7z" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.589307 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2230cdcb-087e-4882-8aea-c5d850b711ac-operator-scripts\") pod \"keystone-d970-account-create-update-lkc7z\" (UID: \"2230cdcb-087e-4882-8aea-c5d850b711ac\") " pod="openstack/keystone-d970-account-create-update-lkc7z" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.598264 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rb248"] Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.606684 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fwgn\" (UniqueName: \"kubernetes.io/projected/2230cdcb-087e-4882-8aea-c5d850b711ac-kube-api-access-9fwgn\") pod \"keystone-d970-account-create-update-lkc7z\" (UID: \"2230cdcb-087e-4882-8aea-c5d850b711ac\") " pod="openstack/keystone-d970-account-create-update-lkc7z" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.621452 4837 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/placement-d9fb-account-create-update-5jvwd"] Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.630657 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gmczg" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.633240 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d9fb-account-create-update-5jvwd" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.633380 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d9fb-account-create-update-5jvwd"] Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.637411 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.689300 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5qkz\" (UniqueName: \"kubernetes.io/projected/737740b8-437c-4c6a-a16f-ac0afcf40b95-kube-api-access-g5qkz\") pod \"placement-db-create-rb248\" (UID: \"737740b8-437c-4c6a-a16f-ac0afcf40b95\") " pod="openstack/placement-db-create-rb248" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.689356 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/737740b8-437c-4c6a-a16f-ac0afcf40b95-operator-scripts\") pod \"placement-db-create-rb248\" (UID: \"737740b8-437c-4c6a-a16f-ac0afcf40b95\") " pod="openstack/placement-db-create-rb248" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.742976 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d970-account-create-update-lkc7z" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.792128 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5qkz\" (UniqueName: \"kubernetes.io/projected/737740b8-437c-4c6a-a16f-ac0afcf40b95-kube-api-access-g5qkz\") pod \"placement-db-create-rb248\" (UID: \"737740b8-437c-4c6a-a16f-ac0afcf40b95\") " pod="openstack/placement-db-create-rb248" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.792204 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5601ea4-ee81-4e2a-b370-268652332465-operator-scripts\") pod \"placement-d9fb-account-create-update-5jvwd\" (UID: \"c5601ea4-ee81-4e2a-b370-268652332465\") " pod="openstack/placement-d9fb-account-create-update-5jvwd" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.792232 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/737740b8-437c-4c6a-a16f-ac0afcf40b95-operator-scripts\") pod \"placement-db-create-rb248\" (UID: \"737740b8-437c-4c6a-a16f-ac0afcf40b95\") " pod="openstack/placement-db-create-rb248" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.792281 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npm9t\" (UniqueName: \"kubernetes.io/projected/c5601ea4-ee81-4e2a-b370-268652332465-kube-api-access-npm9t\") pod \"placement-d9fb-account-create-update-5jvwd\" (UID: \"c5601ea4-ee81-4e2a-b370-268652332465\") " pod="openstack/placement-d9fb-account-create-update-5jvwd" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.793296 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/737740b8-437c-4c6a-a16f-ac0afcf40b95-operator-scripts\") pod \"placement-db-create-rb248\" (UID: \"737740b8-437c-4c6a-a16f-ac0afcf40b95\") " pod="openstack/placement-db-create-rb248" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.814186 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5qkz\" (UniqueName: \"kubernetes.io/projected/737740b8-437c-4c6a-a16f-ac0afcf40b95-kube-api-access-g5qkz\") pod \"placement-db-create-rb248\" (UID: \"737740b8-437c-4c6a-a16f-ac0afcf40b95\") " pod="openstack/placement-db-create-rb248" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.871277 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rb248" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.894481 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5601ea4-ee81-4e2a-b370-268652332465-operator-scripts\") pod \"placement-d9fb-account-create-update-5jvwd\" (UID: \"c5601ea4-ee81-4e2a-b370-268652332465\") " pod="openstack/placement-d9fb-account-create-update-5jvwd" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.894595 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npm9t\" (UniqueName: \"kubernetes.io/projected/c5601ea4-ee81-4e2a-b370-268652332465-kube-api-access-npm9t\") pod \"placement-d9fb-account-create-update-5jvwd\" (UID: \"c5601ea4-ee81-4e2a-b370-268652332465\") " pod="openstack/placement-d9fb-account-create-update-5jvwd" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.895259 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5601ea4-ee81-4e2a-b370-268652332465-operator-scripts\") pod \"placement-d9fb-account-create-update-5jvwd\" (UID: \"c5601ea4-ee81-4e2a-b370-268652332465\") " 
pod="openstack/placement-d9fb-account-create-update-5jvwd" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.914569 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npm9t\" (UniqueName: \"kubernetes.io/projected/c5601ea4-ee81-4e2a-b370-268652332465-kube-api-access-npm9t\") pod \"placement-d9fb-account-create-update-5jvwd\" (UID: \"c5601ea4-ee81-4e2a-b370-268652332465\") " pod="openstack/placement-d9fb-account-create-update-5jvwd" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.954377 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d9fb-account-create-update-5jvwd" Mar 13 12:05:48 crc kubenswrapper[4837]: W0313 12:05:48.518171 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5601ea4_ee81_4e2a_b370_268652332465.slice/crio-390d3e9855f3f57028fb161478457ea4b1e4ef62c19e2881ed44da5bdeee6e27 WatchSource:0}: Error finding container 390d3e9855f3f57028fb161478457ea4b1e4ef62c19e2881ed44da5bdeee6e27: Status 404 returned error can't find the container with id 390d3e9855f3f57028fb161478457ea4b1e4ef62c19e2881ed44da5bdeee6e27 Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.519285 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d9fb-account-create-update-5jvwd"] Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.581551 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rb248"] Mar 13 12:05:48 crc kubenswrapper[4837]: W0313 12:05:48.605982 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod737740b8_437c_4c6a_a16f_ac0afcf40b95.slice/crio-d70df55b1790a1ebbe6a4a6a6fb3d1059c01bf3fda86f5702d2b75048a834895 WatchSource:0}: Error finding container d70df55b1790a1ebbe6a4a6a6fb3d1059c01bf3fda86f5702d2b75048a834895: Status 404 returned 
error can't find the container with id d70df55b1790a1ebbe6a4a6a6fb3d1059c01bf3fda86f5702d2b75048a834895 Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.652858 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-gmczg"] Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.670430 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d970-account-create-update-lkc7z"] Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.815877 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.864582 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d970-account-create-update-lkc7z" event={"ID":"2230cdcb-087e-4882-8aea-c5d850b711ac","Type":"ContainerStarted","Data":"ba21907e7c2549bba4ed2433e390f6db6aab42f1bc7683bed090aa5abb21d188"} Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.866447 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-69xgx" event={"ID":"24998567-afa6-4adc-a503-4fc054946aef","Type":"ContainerStarted","Data":"ab5c46268962acc94f8e7f96b6af1d93a9a7a4507799423762e91ef22d7a30a9"} Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.867919 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d9fb-account-create-update-5jvwd" event={"ID":"c5601ea4-ee81-4e2a-b370-268652332465","Type":"ContainerStarted","Data":"258afa8ad3c4b205a4d5ebbc2dad025a8beb1c8bcd26054b8547c8dad13f8f6c"} Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.867976 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d9fb-account-create-update-5jvwd" event={"ID":"c5601ea4-ee81-4e2a-b370-268652332465","Type":"ContainerStarted","Data":"390d3e9855f3f57028fb161478457ea4b1e4ef62c19e2881ed44da5bdeee6e27"} Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.869351 4837 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gmczg" event={"ID":"5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e","Type":"ContainerStarted","Data":"b098a43f2d9d03bb99f812f96bcd81706cca6018f0407d1c0e0208765a59836f"} Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.871889 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rb248" event={"ID":"737740b8-437c-4c6a-a16f-ac0afcf40b95","Type":"ContainerStarted","Data":"d70df55b1790a1ebbe6a4a6a6fb3d1059c01bf3fda86f5702d2b75048a834895"} Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.874598 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7vs6f"] Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.874834 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" podUID="ae5deee0-59c4-4fa7-8d8c-e12b516885dc" containerName="dnsmasq-dns" containerID="cri-o://5550e9e7c07715ced202f75aa48fd7f0efbb60ae462c21120aa95405a645e06f" gracePeriod=10 Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.897850 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-69xgx" podStartSLOduration=2.213518497 podStartE2EDuration="5.89781357s" podCreationTimestamp="2026-03-13 12:05:43 +0000 UTC" firstStartedPulling="2026-03-13 12:05:44.382503534 +0000 UTC m=+1060.020770297" lastFinishedPulling="2026-03-13 12:05:48.066798607 +0000 UTC m=+1063.705065370" observedRunningTime="2026-03-13 12:05:48.894788134 +0000 UTC m=+1064.533054907" watchObservedRunningTime="2026-03-13 12:05:48.89781357 +0000 UTC m=+1064.536080333" Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.923149 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-d9fb-account-create-update-5jvwd" podStartSLOduration=1.9231266420000002 podStartE2EDuration="1.923126642s" 
podCreationTimestamp="2026-03-13 12:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:05:48.914591371 +0000 UTC m=+1064.552858134" watchObservedRunningTime="2026-03-13 12:05:48.923126642 +0000 UTC m=+1064.561393415" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.405735 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.530179 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhg95\" (UniqueName: \"kubernetes.io/projected/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-kube-api-access-lhg95\") pod \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\" (UID: \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\") " Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.530296 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-dns-svc\") pod \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\" (UID: \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\") " Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.530377 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-config\") pod \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\" (UID: \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\") " Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.543891 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-kube-api-access-lhg95" (OuterVolumeSpecName: "kube-api-access-lhg95") pod "ae5deee0-59c4-4fa7-8d8c-e12b516885dc" (UID: "ae5deee0-59c4-4fa7-8d8c-e12b516885dc"). InnerVolumeSpecName "kube-api-access-lhg95". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.595383 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ae5deee0-59c4-4fa7-8d8c-e12b516885dc" (UID: "ae5deee0-59c4-4fa7-8d8c-e12b516885dc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.608955 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-config" (OuterVolumeSpecName: "config") pod "ae5deee0-59c4-4fa7-8d8c-e12b516885dc" (UID: "ae5deee0-59c4-4fa7-8d8c-e12b516885dc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.631860 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.631896 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.631906 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhg95\" (UniqueName: \"kubernetes.io/projected/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-kube-api-access-lhg95\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.860544 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-tgg8d"] Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.867682 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/root-account-create-update-tgg8d"] Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.879401 4837 generic.go:334] "Generic (PLEG): container finished" podID="5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e" containerID="e460ab529bcbaef415dda78934a987cdd80d8b23f4cad796d19dcd468ce2d5f7" exitCode=0 Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.879449 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gmczg" event={"ID":"5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e","Type":"ContainerDied","Data":"e460ab529bcbaef415dda78934a987cdd80d8b23f4cad796d19dcd468ce2d5f7"} Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.881347 4837 generic.go:334] "Generic (PLEG): container finished" podID="ae5deee0-59c4-4fa7-8d8c-e12b516885dc" containerID="5550e9e7c07715ced202f75aa48fd7f0efbb60ae462c21120aa95405a645e06f" exitCode=0 Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.881406 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" event={"ID":"ae5deee0-59c4-4fa7-8d8c-e12b516885dc","Type":"ContainerDied","Data":"5550e9e7c07715ced202f75aa48fd7f0efbb60ae462c21120aa95405a645e06f"} Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.881437 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" event={"ID":"ae5deee0-59c4-4fa7-8d8c-e12b516885dc","Type":"ContainerDied","Data":"dac372c91256bbeabdb4ee95a6241431746c1c91b7bb9f40ca6c3bd206fe1f51"} Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.881458 4837 scope.go:117] "RemoveContainer" containerID="5550e9e7c07715ced202f75aa48fd7f0efbb60ae462c21120aa95405a645e06f" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.881560 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.885724 4837 generic.go:334] "Generic (PLEG): container finished" podID="737740b8-437c-4c6a-a16f-ac0afcf40b95" containerID="7c2129e0048255a871372a3d7023ed828ca0d6f1f4e610da012f5353ff07c822" exitCode=0 Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.885819 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rb248" event={"ID":"737740b8-437c-4c6a-a16f-ac0afcf40b95","Type":"ContainerDied","Data":"7c2129e0048255a871372a3d7023ed828ca0d6f1f4e610da012f5353ff07c822"} Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.886939 4837 generic.go:334] "Generic (PLEG): container finished" podID="2230cdcb-087e-4882-8aea-c5d850b711ac" containerID="40deea41e769b1017207ec620ac05bd1eeae7028c9b2f3cacb4bc02a7f4fffdf" exitCode=0 Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.886986 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d970-account-create-update-lkc7z" event={"ID":"2230cdcb-087e-4882-8aea-c5d850b711ac","Type":"ContainerDied","Data":"40deea41e769b1017207ec620ac05bd1eeae7028c9b2f3cacb4bc02a7f4fffdf"} Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.888071 4837 generic.go:334] "Generic (PLEG): container finished" podID="c5601ea4-ee81-4e2a-b370-268652332465" containerID="258afa8ad3c4b205a4d5ebbc2dad025a8beb1c8bcd26054b8547c8dad13f8f6c" exitCode=0 Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.888976 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d9fb-account-create-update-5jvwd" event={"ID":"c5601ea4-ee81-4e2a-b370-268652332465","Type":"ContainerDied","Data":"258afa8ad3c4b205a4d5ebbc2dad025a8beb1c8bcd26054b8547c8dad13f8f6c"} Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.906245 4837 scope.go:117] "RemoveContainer" containerID="86bd795421dbb8f853056d0ba23fbd4446b0d4a512f91ff6b5cf999a58e9698a" Mar 13 12:05:49 crc 
kubenswrapper[4837]: I0313 12:05:49.949518 4837 scope.go:117] "RemoveContainer" containerID="5550e9e7c07715ced202f75aa48fd7f0efbb60ae462c21120aa95405a645e06f" Mar 13 12:05:49 crc kubenswrapper[4837]: E0313 12:05:49.950982 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5550e9e7c07715ced202f75aa48fd7f0efbb60ae462c21120aa95405a645e06f\": container with ID starting with 5550e9e7c07715ced202f75aa48fd7f0efbb60ae462c21120aa95405a645e06f not found: ID does not exist" containerID="5550e9e7c07715ced202f75aa48fd7f0efbb60ae462c21120aa95405a645e06f" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.951064 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5550e9e7c07715ced202f75aa48fd7f0efbb60ae462c21120aa95405a645e06f"} err="failed to get container status \"5550e9e7c07715ced202f75aa48fd7f0efbb60ae462c21120aa95405a645e06f\": rpc error: code = NotFound desc = could not find container \"5550e9e7c07715ced202f75aa48fd7f0efbb60ae462c21120aa95405a645e06f\": container with ID starting with 5550e9e7c07715ced202f75aa48fd7f0efbb60ae462c21120aa95405a645e06f not found: ID does not exist" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.951109 4837 scope.go:117] "RemoveContainer" containerID="86bd795421dbb8f853056d0ba23fbd4446b0d4a512f91ff6b5cf999a58e9698a" Mar 13 12:05:49 crc kubenswrapper[4837]: E0313 12:05:49.951477 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86bd795421dbb8f853056d0ba23fbd4446b0d4a512f91ff6b5cf999a58e9698a\": container with ID starting with 86bd795421dbb8f853056d0ba23fbd4446b0d4a512f91ff6b5cf999a58e9698a not found: ID does not exist" containerID="86bd795421dbb8f853056d0ba23fbd4446b0d4a512f91ff6b5cf999a58e9698a" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.951529 4837 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"86bd795421dbb8f853056d0ba23fbd4446b0d4a512f91ff6b5cf999a58e9698a"} err="failed to get container status \"86bd795421dbb8f853056d0ba23fbd4446b0d4a512f91ff6b5cf999a58e9698a\": rpc error: code = NotFound desc = could not find container \"86bd795421dbb8f853056d0ba23fbd4446b0d4a512f91ff6b5cf999a58e9698a\": container with ID starting with 86bd795421dbb8f853056d0ba23fbd4446b0d4a512f91ff6b5cf999a58e9698a not found: ID does not exist" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.965478 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7vs6f"] Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.972798 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7vs6f"] Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.063839 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae5deee0-59c4-4fa7-8d8c-e12b516885dc" path="/var/lib/kubelet/pods/ae5deee0-59c4-4fa7-8d8c-e12b516885dc/volumes" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.064775 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2c8bb14-f7ed-4a97-a6f8-73f67824897e" path="/var/lib/kubelet/pods/b2c8bb14-f7ed-4a97-a6f8-73f67824897e/volumes" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.249490 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rb248" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.361730 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5qkz\" (UniqueName: \"kubernetes.io/projected/737740b8-437c-4c6a-a16f-ac0afcf40b95-kube-api-access-g5qkz\") pod \"737740b8-437c-4c6a-a16f-ac0afcf40b95\" (UID: \"737740b8-437c-4c6a-a16f-ac0afcf40b95\") " Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.361827 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/737740b8-437c-4c6a-a16f-ac0afcf40b95-operator-scripts\") pod \"737740b8-437c-4c6a-a16f-ac0afcf40b95\" (UID: \"737740b8-437c-4c6a-a16f-ac0afcf40b95\") " Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.362760 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/737740b8-437c-4c6a-a16f-ac0afcf40b95-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "737740b8-437c-4c6a-a16f-ac0afcf40b95" (UID: "737740b8-437c-4c6a-a16f-ac0afcf40b95"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.372010 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/737740b8-437c-4c6a-a16f-ac0afcf40b95-kube-api-access-g5qkz" (OuterVolumeSpecName: "kube-api-access-g5qkz") pod "737740b8-437c-4c6a-a16f-ac0afcf40b95" (UID: "737740b8-437c-4c6a-a16f-ac0afcf40b95"). InnerVolumeSpecName "kube-api-access-g5qkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.464387 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d970-account-create-update-lkc7z" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.464515 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5qkz\" (UniqueName: \"kubernetes.io/projected/737740b8-437c-4c6a-a16f-ac0afcf40b95-kube-api-access-g5qkz\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.464943 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/737740b8-437c-4c6a-a16f-ac0afcf40b95-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.474935 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d9fb-account-create-update-5jvwd" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.488039 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gmczg" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.566011 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5601ea4-ee81-4e2a-b370-268652332465-operator-scripts\") pod \"c5601ea4-ee81-4e2a-b370-268652332465\" (UID: \"c5601ea4-ee81-4e2a-b370-268652332465\") " Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.566065 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e-operator-scripts\") pod \"5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e\" (UID: \"5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e\") " Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.566179 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j7s8\" (UniqueName: 
\"kubernetes.io/projected/5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e-kube-api-access-9j7s8\") pod \"5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e\" (UID: \"5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e\") " Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.566247 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fwgn\" (UniqueName: \"kubernetes.io/projected/2230cdcb-087e-4882-8aea-c5d850b711ac-kube-api-access-9fwgn\") pod \"2230cdcb-087e-4882-8aea-c5d850b711ac\" (UID: \"2230cdcb-087e-4882-8aea-c5d850b711ac\") " Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.566296 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2230cdcb-087e-4882-8aea-c5d850b711ac-operator-scripts\") pod \"2230cdcb-087e-4882-8aea-c5d850b711ac\" (UID: \"2230cdcb-087e-4882-8aea-c5d850b711ac\") " Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.566349 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npm9t\" (UniqueName: \"kubernetes.io/projected/c5601ea4-ee81-4e2a-b370-268652332465-kube-api-access-npm9t\") pod \"c5601ea4-ee81-4e2a-b370-268652332465\" (UID: \"c5601ea4-ee81-4e2a-b370-268652332465\") " Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.566536 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5601ea4-ee81-4e2a-b370-268652332465-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c5601ea4-ee81-4e2a-b370-268652332465" (UID: "c5601ea4-ee81-4e2a-b370-268652332465"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.567018 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5601ea4-ee81-4e2a-b370-268652332465-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.567385 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e" (UID: "5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.567381 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2230cdcb-087e-4882-8aea-c5d850b711ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2230cdcb-087e-4882-8aea-c5d850b711ac" (UID: "2230cdcb-087e-4882-8aea-c5d850b711ac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.569427 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2230cdcb-087e-4882-8aea-c5d850b711ac-kube-api-access-9fwgn" (OuterVolumeSpecName: "kube-api-access-9fwgn") pod "2230cdcb-087e-4882-8aea-c5d850b711ac" (UID: "2230cdcb-087e-4882-8aea-c5d850b711ac"). InnerVolumeSpecName "kube-api-access-9fwgn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.569927 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e-kube-api-access-9j7s8" (OuterVolumeSpecName: "kube-api-access-9j7s8") pod "5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e" (UID: "5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e"). InnerVolumeSpecName "kube-api-access-9j7s8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.573138 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5601ea4-ee81-4e2a-b370-268652332465-kube-api-access-npm9t" (OuterVolumeSpecName: "kube-api-access-npm9t") pod "c5601ea4-ee81-4e2a-b370-268652332465" (UID: "c5601ea4-ee81-4e2a-b370-268652332465"). InnerVolumeSpecName "kube-api-access-npm9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.651142 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-jkthw"] Mar 13 12:05:51 crc kubenswrapper[4837]: E0313 12:05:51.651534 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae5deee0-59c4-4fa7-8d8c-e12b516885dc" containerName="init" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.651553 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae5deee0-59c4-4fa7-8d8c-e12b516885dc" containerName="init" Mar 13 12:05:51 crc kubenswrapper[4837]: E0313 12:05:51.651566 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e" containerName="mariadb-database-create" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.651574 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e" containerName="mariadb-database-create" Mar 13 12:05:51 crc kubenswrapper[4837]: E0313 12:05:51.651591 4837 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="737740b8-437c-4c6a-a16f-ac0afcf40b95" containerName="mariadb-database-create" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.651600 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="737740b8-437c-4c6a-a16f-ac0afcf40b95" containerName="mariadb-database-create" Mar 13 12:05:51 crc kubenswrapper[4837]: E0313 12:05:51.651615 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae5deee0-59c4-4fa7-8d8c-e12b516885dc" containerName="dnsmasq-dns" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.651623 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae5deee0-59c4-4fa7-8d8c-e12b516885dc" containerName="dnsmasq-dns" Mar 13 12:05:51 crc kubenswrapper[4837]: E0313 12:05:51.651658 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5601ea4-ee81-4e2a-b370-268652332465" containerName="mariadb-account-create-update" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.651668 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5601ea4-ee81-4e2a-b370-268652332465" containerName="mariadb-account-create-update" Mar 13 12:05:51 crc kubenswrapper[4837]: E0313 12:05:51.651695 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2230cdcb-087e-4882-8aea-c5d850b711ac" containerName="mariadb-account-create-update" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.651705 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2230cdcb-087e-4882-8aea-c5d850b711ac" containerName="mariadb-account-create-update" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.651895 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae5deee0-59c4-4fa7-8d8c-e12b516885dc" containerName="dnsmasq-dns" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.651914 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="737740b8-437c-4c6a-a16f-ac0afcf40b95" containerName="mariadb-database-create" Mar 
13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.651927 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2230cdcb-087e-4882-8aea-c5d850b711ac" containerName="mariadb-account-create-update" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.651942 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e" containerName="mariadb-database-create" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.651956 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5601ea4-ee81-4e2a-b370-268652332465" containerName="mariadb-account-create-update" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.652614 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.658714 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.658731 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-dvhzm" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.660163 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-jkthw"] Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.670618 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fwgn\" (UniqueName: \"kubernetes.io/projected/2230cdcb-087e-4882-8aea-c5d850b711ac-kube-api-access-9fwgn\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.670688 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2230cdcb-087e-4882-8aea-c5d850b711ac-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.670709 4837 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-npm9t\" (UniqueName: \"kubernetes.io/projected/c5601ea4-ee81-4e2a-b370-268652332465-kube-api-access-npm9t\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.670725 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.670741 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j7s8\" (UniqueName: \"kubernetes.io/projected/5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e-kube-api-access-9j7s8\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.772546 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-config-data\") pod \"glance-db-sync-jkthw\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.772709 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zh4r\" (UniqueName: \"kubernetes.io/projected/b4490fb3-45d7-4b40-ad34-5bf33ba88491-kube-api-access-5zh4r\") pod \"glance-db-sync-jkthw\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.772758 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-combined-ca-bundle\") pod \"glance-db-sync-jkthw\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.772878 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-db-sync-config-data\") pod \"glance-db-sync-jkthw\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.874881 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-config-data\") pod \"glance-db-sync-jkthw\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.874965 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zh4r\" (UniqueName: \"kubernetes.io/projected/b4490fb3-45d7-4b40-ad34-5bf33ba88491-kube-api-access-5zh4r\") pod \"glance-db-sync-jkthw\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.875007 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-combined-ca-bundle\") pod \"glance-db-sync-jkthw\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.875101 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-db-sync-config-data\") pod \"glance-db-sync-jkthw\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.879460 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-db-sync-config-data\") pod \"glance-db-sync-jkthw\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.879762 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-combined-ca-bundle\") pod \"glance-db-sync-jkthw\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.880139 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-config-data\") pod \"glance-db-sync-jkthw\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.895241 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zh4r\" (UniqueName: \"kubernetes.io/projected/b4490fb3-45d7-4b40-ad34-5bf33ba88491-kube-api-access-5zh4r\") pod \"glance-db-sync-jkthw\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.907998 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d9fb-account-create-update-5jvwd" event={"ID":"c5601ea4-ee81-4e2a-b370-268652332465","Type":"ContainerDied","Data":"390d3e9855f3f57028fb161478457ea4b1e4ef62c19e2881ed44da5bdeee6e27"} Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.908054 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="390d3e9855f3f57028fb161478457ea4b1e4ef62c19e2881ed44da5bdeee6e27" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.908119 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d9fb-account-create-update-5jvwd" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.911028 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gmczg" event={"ID":"5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e","Type":"ContainerDied","Data":"b098a43f2d9d03bb99f812f96bcd81706cca6018f0407d1c0e0208765a59836f"} Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.911070 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b098a43f2d9d03bb99f812f96bcd81706cca6018f0407d1c0e0208765a59836f" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.911125 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gmczg" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.921036 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rb248" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.921034 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rb248" event={"ID":"737740b8-437c-4c6a-a16f-ac0afcf40b95","Type":"ContainerDied","Data":"d70df55b1790a1ebbe6a4a6a6fb3d1059c01bf3fda86f5702d2b75048a834895"} Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.921319 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d70df55b1790a1ebbe6a4a6a6fb3d1059c01bf3fda86f5702d2b75048a834895" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.923512 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d970-account-create-update-lkc7z" event={"ID":"2230cdcb-087e-4882-8aea-c5d850b711ac","Type":"ContainerDied","Data":"ba21907e7c2549bba4ed2433e390f6db6aab42f1bc7683bed090aa5abb21d188"} Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.923542 4837 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="ba21907e7c2549bba4ed2433e390f6db6aab42f1bc7683bed090aa5abb21d188" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.927966 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d970-account-create-update-lkc7z" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.970969 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:52 crc kubenswrapper[4837]: I0313 12:05:52.481384 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-jkthw"] Mar 13 12:05:52 crc kubenswrapper[4837]: I0313 12:05:52.933450 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jkthw" event={"ID":"b4490fb3-45d7-4b40-ad34-5bf33ba88491","Type":"ContainerStarted","Data":"17d86872aee9655dc63bbe1e8b164cedfec91be43293c0487555e85e1e22c479"} Mar 13 12:05:54 crc kubenswrapper[4837]: I0313 12:05:54.849747 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-zgdc9"] Mar 13 12:05:54 crc kubenswrapper[4837]: I0313 12:05:54.850713 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zgdc9" Mar 13 12:05:54 crc kubenswrapper[4837]: I0313 12:05:54.854393 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 13 12:05:54 crc kubenswrapper[4837]: I0313 12:05:54.858887 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zgdc9"] Mar 13 12:05:54 crc kubenswrapper[4837]: I0313 12:05:54.926397 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4x4s\" (UniqueName: \"kubernetes.io/projected/2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f-kube-api-access-w4x4s\") pod \"root-account-create-update-zgdc9\" (UID: \"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f\") " pod="openstack/root-account-create-update-zgdc9" Mar 13 12:05:54 crc kubenswrapper[4837]: I0313 12:05:54.926494 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f-operator-scripts\") pod \"root-account-create-update-zgdc9\" (UID: \"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f\") " pod="openstack/root-account-create-update-zgdc9" Mar 13 12:05:54 crc kubenswrapper[4837]: I0313 12:05:54.959754 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 13 12:05:55 crc kubenswrapper[4837]: I0313 12:05:55.028712 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4x4s\" (UniqueName: \"kubernetes.io/projected/2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f-kube-api-access-w4x4s\") pod \"root-account-create-update-zgdc9\" (UID: \"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f\") " pod="openstack/root-account-create-update-zgdc9" Mar 13 12:05:55 crc kubenswrapper[4837]: I0313 12:05:55.028832 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f-operator-scripts\") pod \"root-account-create-update-zgdc9\" (UID: \"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f\") " pod="openstack/root-account-create-update-zgdc9" Mar 13 12:05:55 crc kubenswrapper[4837]: I0313 12:05:55.030737 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f-operator-scripts\") pod \"root-account-create-update-zgdc9\" (UID: \"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f\") " pod="openstack/root-account-create-update-zgdc9" Mar 13 12:05:55 crc kubenswrapper[4837]: I0313 12:05:55.055135 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4x4s\" (UniqueName: \"kubernetes.io/projected/2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f-kube-api-access-w4x4s\") pod \"root-account-create-update-zgdc9\" (UID: \"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f\") " pod="openstack/root-account-create-update-zgdc9" Mar 13 12:05:55 crc kubenswrapper[4837]: I0313 12:05:55.215729 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zgdc9" Mar 13 12:05:55 crc kubenswrapper[4837]: I0313 12:05:55.442429 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:55 crc kubenswrapper[4837]: I0313 12:05:55.451073 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:55 crc kubenswrapper[4837]: I0313 12:05:55.514375 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 13 12:05:55 crc kubenswrapper[4837]: I0313 12:05:55.663505 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zgdc9"] Mar 13 12:05:55 crc kubenswrapper[4837]: W0313 12:05:55.673101 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f1ebb88_e1c9_4839_9c66_8bd86e4b0d5f.slice/crio-50d96f39e72fd0fb719f907fcf63961ca9fa32d054985b3b8e4e41d43e6a3970 WatchSource:0}: Error finding container 50d96f39e72fd0fb719f907fcf63961ca9fa32d054985b3b8e4e41d43e6a3970: Status 404 returned error can't find the container with id 50d96f39e72fd0fb719f907fcf63961ca9fa32d054985b3b8e4e41d43e6a3970 Mar 13 12:05:55 crc kubenswrapper[4837]: I0313 12:05:55.959876 4837 generic.go:334] "Generic (PLEG): container finished" podID="24998567-afa6-4adc-a503-4fc054946aef" containerID="ab5c46268962acc94f8e7f96b6af1d93a9a7a4507799423762e91ef22d7a30a9" exitCode=0 Mar 13 12:05:55 crc kubenswrapper[4837]: I0313 12:05:55.959906 4837 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-69xgx" event={"ID":"24998567-afa6-4adc-a503-4fc054946aef","Type":"ContainerDied","Data":"ab5c46268962acc94f8e7f96b6af1d93a9a7a4507799423762e91ef22d7a30a9"} Mar 13 12:05:55 crc kubenswrapper[4837]: I0313 12:05:55.962223 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zgdc9" event={"ID":"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f","Type":"ContainerStarted","Data":"248bbf02c11ba4d4459897916fec2f24105abad663f25d012f6888d993c3fbac"} Mar 13 12:05:55 crc kubenswrapper[4837]: I0313 12:05:55.962252 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zgdc9" event={"ID":"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f","Type":"ContainerStarted","Data":"50d96f39e72fd0fb719f907fcf63961ca9fa32d054985b3b8e4e41d43e6a3970"} Mar 13 12:05:56 crc kubenswrapper[4837]: I0313 12:05:56.003562 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-zgdc9" podStartSLOduration=2.003540309 podStartE2EDuration="2.003540309s" podCreationTimestamp="2026-03-13 12:05:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:05:55.996400232 +0000 UTC m=+1071.634667005" watchObservedRunningTime="2026-03-13 12:05:56.003540309 +0000 UTC m=+1071.641807072" Mar 13 12:05:56 crc kubenswrapper[4837]: I0313 12:05:56.139716 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 13 12:05:56 crc kubenswrapper[4837]: W0313 12:05:56.175900 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59565710_b9bc_46e6_ad92_7f12376de17c.slice/crio-2fe1e5cd43292f1fe51e0913fb9f74c254e49b17cae827baa09a3570dfd0d830 WatchSource:0}: Error finding container 
2fe1e5cd43292f1fe51e0913fb9f74c254e49b17cae827baa09a3570dfd0d830: Status 404 returned error can't find the container with id 2fe1e5cd43292f1fe51e0913fb9f74c254e49b17cae827baa09a3570dfd0d830 Mar 13 12:05:56 crc kubenswrapper[4837]: I0313 12:05:56.972412 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-nbhpw" podUID="32dc51d9-5638-4530-91c8-5be8c13e60f3" containerName="ovn-controller" probeResult="failure" output=< Mar 13 12:05:56 crc kubenswrapper[4837]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 13 12:05:56 crc kubenswrapper[4837]: > Mar 13 12:05:56 crc kubenswrapper[4837]: I0313 12:05:56.974418 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"2fe1e5cd43292f1fe51e0913fb9f74c254e49b17cae827baa09a3570dfd0d830"} Mar 13 12:05:56 crc kubenswrapper[4837]: I0313 12:05:56.975568 4837 generic.go:334] "Generic (PLEG): container finished" podID="2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f" containerID="248bbf02c11ba4d4459897916fec2f24105abad663f25d012f6888d993c3fbac" exitCode=0 Mar 13 12:05:56 crc kubenswrapper[4837]: I0313 12:05:56.975985 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zgdc9" event={"ID":"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f","Type":"ContainerDied","Data":"248bbf02c11ba4d4459897916fec2f24105abad663f25d012f6888d993c3fbac"} Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.092429 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.433530 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.585180 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxh5\" (UniqueName: \"kubernetes.io/projected/24998567-afa6-4adc-a503-4fc054946aef-kube-api-access-pcxh5\") pod \"24998567-afa6-4adc-a503-4fc054946aef\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.585293 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/24998567-afa6-4adc-a503-4fc054946aef-ring-data-devices\") pod \"24998567-afa6-4adc-a503-4fc054946aef\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.585331 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24998567-afa6-4adc-a503-4fc054946aef-scripts\") pod \"24998567-afa6-4adc-a503-4fc054946aef\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.585372 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-swiftconf\") pod \"24998567-afa6-4adc-a503-4fc054946aef\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.585416 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-combined-ca-bundle\") pod \"24998567-afa6-4adc-a503-4fc054946aef\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.585450 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/24998567-afa6-4adc-a503-4fc054946aef-etc-swift\") pod \"24998567-afa6-4adc-a503-4fc054946aef\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.586173 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24998567-afa6-4adc-a503-4fc054946aef-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "24998567-afa6-4adc-a503-4fc054946aef" (UID: "24998567-afa6-4adc-a503-4fc054946aef"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.586804 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24998567-afa6-4adc-a503-4fc054946aef-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "24998567-afa6-4adc-a503-4fc054946aef" (UID: "24998567-afa6-4adc-a503-4fc054946aef"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.587298 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-dispersionconf\") pod \"24998567-afa6-4adc-a503-4fc054946aef\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.588554 4837 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/24998567-afa6-4adc-a503-4fc054946aef-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.588586 4837 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/24998567-afa6-4adc-a503-4fc054946aef-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.592601 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24998567-afa6-4adc-a503-4fc054946aef-kube-api-access-pcxh5" (OuterVolumeSpecName: "kube-api-access-pcxh5") pod "24998567-afa6-4adc-a503-4fc054946aef" (UID: "24998567-afa6-4adc-a503-4fc054946aef"). InnerVolumeSpecName "kube-api-access-pcxh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.603819 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "24998567-afa6-4adc-a503-4fc054946aef" (UID: "24998567-afa6-4adc-a503-4fc054946aef"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.622978 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24998567-afa6-4adc-a503-4fc054946aef" (UID: "24998567-afa6-4adc-a503-4fc054946aef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.630004 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "24998567-afa6-4adc-a503-4fc054946aef" (UID: "24998567-afa6-4adc-a503-4fc054946aef"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.640140 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24998567-afa6-4adc-a503-4fc054946aef-scripts" (OuterVolumeSpecName: "scripts") pod "24998567-afa6-4adc-a503-4fc054946aef" (UID: "24998567-afa6-4adc-a503-4fc054946aef"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.690471 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24998567-afa6-4adc-a503-4fc054946aef-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.690825 4837 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.690837 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.690849 4837 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.690859 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxh5\" (UniqueName: \"kubernetes.io/projected/24998567-afa6-4adc-a503-4fc054946aef-kube-api-access-pcxh5\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:58 crc kubenswrapper[4837]: I0313 12:05:58.019532 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"187247b205d7cc45292d712b11cea043a6f8c69a568d8a89e720e74e571f5b51"} Mar 13 12:05:58 crc kubenswrapper[4837]: I0313 12:05:58.021551 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"00c6ab786fb53319d36acf5ced6f02a3ba152d4adfca8fe3dec6e9106fb84434"} Mar 13 
12:05:58 crc kubenswrapper[4837]: I0313 12:05:58.028220 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:58 crc kubenswrapper[4837]: I0313 12:05:58.028312 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-69xgx" event={"ID":"24998567-afa6-4adc-a503-4fc054946aef","Type":"ContainerDied","Data":"1ed0808ad30fc86e0f532b5d60900d3ad34dad8095b22ae074e18418eb961f7b"} Mar 13 12:05:58 crc kubenswrapper[4837]: I0313 12:05:58.028361 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ed0808ad30fc86e0f532b5d60900d3ad34dad8095b22ae074e18418eb961f7b" Mar 13 12:05:58 crc kubenswrapper[4837]: E0313 12:05:58.105793 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24998567_afa6_4adc_a503_4fc054946aef.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:05:58 crc kubenswrapper[4837]: I0313 12:05:58.583576 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zgdc9" Mar 13 12:05:58 crc kubenswrapper[4837]: I0313 12:05:58.712562 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f-operator-scripts\") pod \"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f\" (UID: \"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f\") " Mar 13 12:05:58 crc kubenswrapper[4837]: I0313 12:05:58.712738 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4x4s\" (UniqueName: \"kubernetes.io/projected/2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f-kube-api-access-w4x4s\") pod \"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f\" (UID: \"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f\") " Mar 13 12:05:58 crc kubenswrapper[4837]: I0313 12:05:58.714002 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f" (UID: "2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:58 crc kubenswrapper[4837]: I0313 12:05:58.720019 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f-kube-api-access-w4x4s" (OuterVolumeSpecName: "kube-api-access-w4x4s") pod "2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f" (UID: "2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f"). InnerVolumeSpecName "kube-api-access-w4x4s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:58 crc kubenswrapper[4837]: I0313 12:05:58.814402 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4x4s\" (UniqueName: \"kubernetes.io/projected/2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f-kube-api-access-w4x4s\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:58 crc kubenswrapper[4837]: I0313 12:05:58.814442 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:59 crc kubenswrapper[4837]: I0313 12:05:59.043460 4837 generic.go:334] "Generic (PLEG): container finished" podID="e7b01be4-73b6-48eb-a06d-4fb38863d982" containerID="afe3a88a0e8205fefe122a8099e4acc29a3ebc22c1a9a9cfe3c00f5ab1794007" exitCode=0 Mar 13 12:05:59 crc kubenswrapper[4837]: I0313 12:05:59.043537 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e7b01be4-73b6-48eb-a06d-4fb38863d982","Type":"ContainerDied","Data":"afe3a88a0e8205fefe122a8099e4acc29a3ebc22c1a9a9cfe3c00f5ab1794007"} Mar 13 12:05:59 crc kubenswrapper[4837]: I0313 12:05:59.071274 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"0c046f2ec93fccdfe07d6437b7eb7f95762c9aa74d5b6a853c7d3a653c626650"} Mar 13 12:05:59 crc kubenswrapper[4837]: I0313 12:05:59.071326 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"a10a1af8e9199733d353c7a39cb8ff947c9d7b4af81a363d438963cbb65562b0"} Mar 13 12:05:59 crc kubenswrapper[4837]: I0313 12:05:59.071464 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zgdc9" 
event={"ID":"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f","Type":"ContainerDied","Data":"50d96f39e72fd0fb719f907fcf63961ca9fa32d054985b3b8e4e41d43e6a3970"} Mar 13 12:05:59 crc kubenswrapper[4837]: I0313 12:05:59.071506 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50d96f39e72fd0fb719f907fcf63961ca9fa32d054985b3b8e4e41d43e6a3970" Mar 13 12:05:59 crc kubenswrapper[4837]: I0313 12:05:59.071565 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zgdc9" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.084929 4837 generic.go:334] "Generic (PLEG): container finished" podID="13254c8b-516c-435e-9db2-a8d518434f29" containerID="d7e3a2439b933c4a76e0a0472aaff3cf352a36e55d6c5a3aa674478c5299b9be" exitCode=0 Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.085015 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"13254c8b-516c-435e-9db2-a8d518434f29","Type":"ContainerDied","Data":"d7e3a2439b933c4a76e0a0472aaff3cf352a36e55d6c5a3aa674478c5299b9be"} Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.092310 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e7b01be4-73b6-48eb-a06d-4fb38863d982","Type":"ContainerStarted","Data":"616ab6849fdbe4a471544990fac8e7c0dc2c1e3c72338a214f97078a3b1bb01c"} Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.093073 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.149117 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556726-gdbfm"] Mar 13 12:06:00 crc kubenswrapper[4837]: E0313 12:06:00.149475 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f" containerName="mariadb-account-create-update" Mar 13 12:06:00 
crc kubenswrapper[4837]: I0313 12:06:00.149489 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f" containerName="mariadb-account-create-update" Mar 13 12:06:00 crc kubenswrapper[4837]: E0313 12:06:00.149507 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24998567-afa6-4adc-a503-4fc054946aef" containerName="swift-ring-rebalance" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.149513 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="24998567-afa6-4adc-a503-4fc054946aef" containerName="swift-ring-rebalance" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.149712 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="24998567-afa6-4adc-a503-4fc054946aef" containerName="swift-ring-rebalance" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.149729 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f" containerName="mariadb-account-create-update" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.150191 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556726-gdbfm" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.152097 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.158017 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.158265 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.183271 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556726-gdbfm"] Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.213351 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=49.564002375 podStartE2EDuration="58.213327017s" podCreationTimestamp="2026-03-13 12:05:02 +0000 UTC" firstStartedPulling="2026-03-13 12:05:15.770907991 +0000 UTC m=+1031.409174754" lastFinishedPulling="2026-03-13 12:05:24.420232633 +0000 UTC m=+1040.058499396" observedRunningTime="2026-03-13 12:06:00.196945378 +0000 UTC m=+1075.835212141" watchObservedRunningTime="2026-03-13 12:06:00.213327017 +0000 UTC m=+1075.851593780" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.247680 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nqtb\" (UniqueName: \"kubernetes.io/projected/83f46fff-3510-4758-82a0-30099640fa33-kube-api-access-9nqtb\") pod \"auto-csr-approver-29556726-gdbfm\" (UID: \"83f46fff-3510-4758-82a0-30099640fa33\") " pod="openshift-infra/auto-csr-approver-29556726-gdbfm" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.349570 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9nqtb\" (UniqueName: \"kubernetes.io/projected/83f46fff-3510-4758-82a0-30099640fa33-kube-api-access-9nqtb\") pod \"auto-csr-approver-29556726-gdbfm\" (UID: \"83f46fff-3510-4758-82a0-30099640fa33\") " pod="openshift-infra/auto-csr-approver-29556726-gdbfm" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.372809 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nqtb\" (UniqueName: \"kubernetes.io/projected/83f46fff-3510-4758-82a0-30099640fa33-kube-api-access-9nqtb\") pod \"auto-csr-approver-29556726-gdbfm\" (UID: \"83f46fff-3510-4758-82a0-30099640fa33\") " pod="openshift-infra/auto-csr-approver-29556726-gdbfm" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.481930 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556726-gdbfm" Mar 13 12:06:01 crc kubenswrapper[4837]: I0313 12:06:01.988265 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-nbhpw" podUID="32dc51d9-5638-4530-91c8-5be8c13e60f3" containerName="ovn-controller" probeResult="failure" output=< Mar 13 12:06:01 crc kubenswrapper[4837]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 13 12:06:01 crc kubenswrapper[4837]: > Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.065592 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.283700 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nbhpw-config-hrgcj"] Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.284831 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.288904 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.313290 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nbhpw-config-hrgcj"] Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.401620 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-run-ovn\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.401706 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jgwn\" (UniqueName: \"kubernetes.io/projected/6776e647-6987-4359-baa9-14ba621118d2-kube-api-access-5jgwn\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.401745 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-log-ovn\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.401810 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6776e647-6987-4359-baa9-14ba621118d2-additional-scripts\") pod 
\"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.401860 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-run\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.401887 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6776e647-6987-4359-baa9-14ba621118d2-scripts\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.503209 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6776e647-6987-4359-baa9-14ba621118d2-scripts\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.503320 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-run-ovn\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.503342 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jgwn\" (UniqueName: \"kubernetes.io/projected/6776e647-6987-4359-baa9-14ba621118d2-kube-api-access-5jgwn\") pod 
\"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.503372 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-log-ovn\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.503423 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6776e647-6987-4359-baa9-14ba621118d2-additional-scripts\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.503478 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-run\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.503814 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-run\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.505948 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6776e647-6987-4359-baa9-14ba621118d2-scripts\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: 
\"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.506012 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-run-ovn\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.506319 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-log-ovn\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.506902 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6776e647-6987-4359-baa9-14ba621118d2-additional-scripts\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.539319 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jgwn\" (UniqueName: \"kubernetes.io/projected/6776e647-6987-4359-baa9-14ba621118d2-kube-api-access-5jgwn\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.613845 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:06 crc kubenswrapper[4837]: I0313 12:06:06.969585 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-nbhpw" podUID="32dc51d9-5638-4530-91c8-5be8c13e60f3" containerName="ovn-controller" probeResult="failure" output=< Mar 13 12:06:06 crc kubenswrapper[4837]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 13 12:06:06 crc kubenswrapper[4837]: > Mar 13 12:06:10 crc kubenswrapper[4837]: E0313 12:06:10.230784 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Mar 13 12:06:10 crc kubenswrapper[4837]: E0313 12:06:10.231832 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5zh4r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-jkthw_openstack(b4490fb3-45d7-4b40-ad34-5bf33ba88491): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Mar 13 12:06:10 crc kubenswrapper[4837]: E0313 12:06:10.233028 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-jkthw" podUID="b4490fb3-45d7-4b40-ad34-5bf33ba88491" Mar 13 12:06:10 crc kubenswrapper[4837]: I0313 12:06:10.687089 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nbhpw-config-hrgcj"] Mar 13 12:06:10 crc kubenswrapper[4837]: W0313 12:06:10.705631 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6776e647_6987_4359_baa9_14ba621118d2.slice/crio-83aae3e77d9623e024f76a9f48851e9cd8dc7696270a9bc666154f49b6134727 WatchSource:0}: Error finding container 83aae3e77d9623e024f76a9f48851e9cd8dc7696270a9bc666154f49b6134727: Status 404 returned error can't find the container with id 83aae3e77d9623e024f76a9f48851e9cd8dc7696270a9bc666154f49b6134727 Mar 13 12:06:10 crc kubenswrapper[4837]: I0313 12:06:10.744135 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556726-gdbfm"] Mar 13 12:06:10 crc kubenswrapper[4837]: W0313 12:06:10.750913 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83f46fff_3510_4758_82a0_30099640fa33.slice/crio-97fa7ceb35dbfdb338b3e21974c4f2697b4dc443944351f3a97c4fa67dd5702a WatchSource:0}: Error finding container 97fa7ceb35dbfdb338b3e21974c4f2697b4dc443944351f3a97c4fa67dd5702a: Status 404 returned error can't find the container with id 97fa7ceb35dbfdb338b3e21974c4f2697b4dc443944351f3a97c4fa67dd5702a Mar 13 12:06:11 crc kubenswrapper[4837]: I0313 12:06:11.229547 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"13254c8b-516c-435e-9db2-a8d518434f29","Type":"ContainerStarted","Data":"0464fd995746ce013b42a116039d645f924aa9c972effad4862f7a836f1488e1"} Mar 13 12:06:11 crc kubenswrapper[4837]: I0313 12:06:11.231565 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:06:11 crc kubenswrapper[4837]: I0313 12:06:11.235468 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nbhpw-config-hrgcj" event={"ID":"6776e647-6987-4359-baa9-14ba621118d2","Type":"ContainerStarted","Data":"8b3e536f3d4311421b7a8a53f994fc3c95b97d5e112a955f101e290d9b221b2d"} Mar 13 12:06:11 crc kubenswrapper[4837]: I0313 12:06:11.235513 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nbhpw-config-hrgcj" event={"ID":"6776e647-6987-4359-baa9-14ba621118d2","Type":"ContainerStarted","Data":"83aae3e77d9623e024f76a9f48851e9cd8dc7696270a9bc666154f49b6134727"} Mar 13 12:06:11 crc kubenswrapper[4837]: I0313 12:06:11.241227 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"34f268940b6244ea00e96250111548b4ac0a41e171f6d03580dec0254f9f213e"} Mar 13 12:06:11 crc kubenswrapper[4837]: I0313 12:06:11.241263 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"166173c05cd8668ae62e105ed92a57900aa789cb5bc89ec73b08c97b367f478c"} Mar 13 12:06:11 crc kubenswrapper[4837]: I0313 12:06:11.241273 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"af33c4644aa0b53237784c5c19cfe673d7980526a0f05cdb546c113f3a1f90ed"} Mar 13 12:06:11 crc kubenswrapper[4837]: I0313 12:06:11.241284 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"dc72455cada2574d7dee9c4d762184172c399e0801f1156290b97c205dba2901"} Mar 13 12:06:11 crc kubenswrapper[4837]: I0313 12:06:11.243218 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556726-gdbfm" event={"ID":"83f46fff-3510-4758-82a0-30099640fa33","Type":"ContainerStarted","Data":"97fa7ceb35dbfdb338b3e21974c4f2697b4dc443944351f3a97c4fa67dd5702a"} Mar 13 12:06:11 crc kubenswrapper[4837]: E0313 12:06:11.243991 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-jkthw" podUID="b4490fb3-45d7-4b40-ad34-5bf33ba88491" Mar 13 12:06:11 crc kubenswrapper[4837]: I0313 12:06:11.274727 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=62.639597458 podStartE2EDuration="1m10.274706475s" podCreationTimestamp="2026-03-13 12:05:01 +0000 UTC" firstStartedPulling="2026-03-13 12:05:16.487863781 +0000 UTC m=+1032.126130544" lastFinishedPulling="2026-03-13 12:05:24.122972798 +0000 UTC m=+1039.761239561" observedRunningTime="2026-03-13 12:06:11.262747186 +0000 UTC m=+1086.901013959" watchObservedRunningTime="2026-03-13 12:06:11.274706475 +0000 UTC m=+1086.912973238" Mar 13 12:06:11 crc kubenswrapper[4837]: I0313 12:06:11.282993 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-nbhpw-config-hrgcj" podStartSLOduration=9.282973796 podStartE2EDuration="9.282973796s" podCreationTimestamp="2026-03-13 12:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:11.279657851 +0000 UTC 
m=+1086.917924634" watchObservedRunningTime="2026-03-13 12:06:11.282973796 +0000 UTC m=+1086.921240559" Mar 13 12:06:11 crc kubenswrapper[4837]: I0313 12:06:11.973403 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-nbhpw" Mar 13 12:06:12 crc kubenswrapper[4837]: I0313 12:06:12.251825 4837 generic.go:334] "Generic (PLEG): container finished" podID="6776e647-6987-4359-baa9-14ba621118d2" containerID="8b3e536f3d4311421b7a8a53f994fc3c95b97d5e112a955f101e290d9b221b2d" exitCode=0 Mar 13 12:06:12 crc kubenswrapper[4837]: I0313 12:06:12.251959 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nbhpw-config-hrgcj" event={"ID":"6776e647-6987-4359-baa9-14ba621118d2","Type":"ContainerDied","Data":"8b3e536f3d4311421b7a8a53f994fc3c95b97d5e112a955f101e290d9b221b2d"} Mar 13 12:06:12 crc kubenswrapper[4837]: I0313 12:06:12.254615 4837 generic.go:334] "Generic (PLEG): container finished" podID="83f46fff-3510-4758-82a0-30099640fa33" containerID="1858aaffb80ca26b2ecab85a7aa907d93bda6b050db7fd69c55fcebb623536ef" exitCode=0 Mar 13 12:06:12 crc kubenswrapper[4837]: I0313 12:06:12.254667 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556726-gdbfm" event={"ID":"83f46fff-3510-4758-82a0-30099640fa33","Type":"ContainerDied","Data":"1858aaffb80ca26b2ecab85a7aa907d93bda6b050db7fd69c55fcebb623536ef"} Mar 13 12:06:13 crc kubenswrapper[4837]: I0313 12:06:13.267054 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"f527d66eb0a2a1ab3d7502e880eff4ba411c1086452adb7c46216c179a97766b"} Mar 13 12:06:13 crc kubenswrapper[4837]: I0313 12:06:13.268014 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"be4a385c843853220c9ce490f42ed62425d4c6c763d42763c0c52cd9c2057711"} Mar 13 12:06:13 crc kubenswrapper[4837]: I0313 12:06:13.268218 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"03de36fa370480219ae91f786ca94823d401f1b31b4a8b2433f10907447d95a0"} Mar 13 12:06:13 crc kubenswrapper[4837]: I0313 12:06:13.268306 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"864955d1c783a52f4bfa111bbfec7cf156c8f217ae8471d28e2d198fcddaabe1"} Mar 13 12:06:13 crc kubenswrapper[4837]: I0313 12:06:13.488408 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 13 12:06:13 crc kubenswrapper[4837]: I0313 12:06:13.916469 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556726-gdbfm" Mar 13 12:06:13 crc kubenswrapper[4837]: I0313 12:06:13.939245 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:13 crc kubenswrapper[4837]: I0313 12:06:13.948802 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-mbps4"] Mar 13 12:06:13 crc kubenswrapper[4837]: E0313 12:06:13.949263 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83f46fff-3510-4758-82a0-30099640fa33" containerName="oc" Mar 13 12:06:13 crc kubenswrapper[4837]: I0313 12:06:13.949288 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f46fff-3510-4758-82a0-30099640fa33" containerName="oc" Mar 13 12:06:13 crc kubenswrapper[4837]: E0313 12:06:13.949314 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6776e647-6987-4359-baa9-14ba621118d2" containerName="ovn-config" Mar 13 12:06:13 crc kubenswrapper[4837]: I0313 12:06:13.949322 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6776e647-6987-4359-baa9-14ba621118d2" containerName="ovn-config" Mar 13 12:06:13 crc kubenswrapper[4837]: I0313 12:06:13.949553 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="83f46fff-3510-4758-82a0-30099640fa33" containerName="oc" Mar 13 12:06:13 crc kubenswrapper[4837]: I0313 12:06:13.949580 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6776e647-6987-4359-baa9-14ba621118d2" containerName="ovn-config" Mar 13 12:06:13 crc kubenswrapper[4837]: I0313 12:06:13.950242 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-mbps4" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.010099 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-mbps4"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.040847 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jgwn\" (UniqueName: \"kubernetes.io/projected/6776e647-6987-4359-baa9-14ba621118d2-kube-api-access-5jgwn\") pod \"6776e647-6987-4359-baa9-14ba621118d2\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.040947 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6776e647-6987-4359-baa9-14ba621118d2-scripts\") pod \"6776e647-6987-4359-baa9-14ba621118d2\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.040977 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nqtb\" (UniqueName: \"kubernetes.io/projected/83f46fff-3510-4758-82a0-30099640fa33-kube-api-access-9nqtb\") pod \"83f46fff-3510-4758-82a0-30099640fa33\" (UID: \"83f46fff-3510-4758-82a0-30099640fa33\") " Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.041039 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-run\") pod \"6776e647-6987-4359-baa9-14ba621118d2\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.041102 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6776e647-6987-4359-baa9-14ba621118d2-additional-scripts\") pod \"6776e647-6987-4359-baa9-14ba621118d2\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " 
Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.041123 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-run-ovn\") pod \"6776e647-6987-4359-baa9-14ba621118d2\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.041198 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-log-ovn\") pod \"6776e647-6987-4359-baa9-14ba621118d2\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.041421 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/685f13a4-d293-4199-8049-67b02c0162c1-operator-scripts\") pod \"cinder-db-create-mbps4\" (UID: \"685f13a4-d293-4199-8049-67b02c0162c1\") " pod="openstack/cinder-db-create-mbps4" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.041476 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpbm8\" (UniqueName: \"kubernetes.io/projected/685f13a4-d293-4199-8049-67b02c0162c1-kube-api-access-dpbm8\") pod \"cinder-db-create-mbps4\" (UID: \"685f13a4-d293-4199-8049-67b02c0162c1\") " pod="openstack/cinder-db-create-mbps4" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.042795 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6776e647-6987-4359-baa9-14ba621118d2-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "6776e647-6987-4359-baa9-14ba621118d2" (UID: "6776e647-6987-4359-baa9-14ba621118d2"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.043909 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6776e647-6987-4359-baa9-14ba621118d2-scripts" (OuterVolumeSpecName: "scripts") pod "6776e647-6987-4359-baa9-14ba621118d2" (UID: "6776e647-6987-4359-baa9-14ba621118d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.046672 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-run" (OuterVolumeSpecName: "var-run") pod "6776e647-6987-4359-baa9-14ba621118d2" (UID: "6776e647-6987-4359-baa9-14ba621118d2"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.046747 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "6776e647-6987-4359-baa9-14ba621118d2" (UID: "6776e647-6987-4359-baa9-14ba621118d2"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.046774 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "6776e647-6987-4359-baa9-14ba621118d2" (UID: "6776e647-6987-4359-baa9-14ba621118d2"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.047099 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6776e647-6987-4359-baa9-14ba621118d2-kube-api-access-5jgwn" (OuterVolumeSpecName: "kube-api-access-5jgwn") pod "6776e647-6987-4359-baa9-14ba621118d2" (UID: "6776e647-6987-4359-baa9-14ba621118d2"). InnerVolumeSpecName "kube-api-access-5jgwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.049736 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83f46fff-3510-4758-82a0-30099640fa33-kube-api-access-9nqtb" (OuterVolumeSpecName: "kube-api-access-9nqtb") pod "83f46fff-3510-4758-82a0-30099640fa33" (UID: "83f46fff-3510-4758-82a0-30099640fa33"). InnerVolumeSpecName "kube-api-access-9nqtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.173846 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-9a59-account-create-update-hqxzk"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.176171 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9a59-account-create-update-hqxzk" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.187888 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.199735 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/685f13a4-d293-4199-8049-67b02c0162c1-operator-scripts\") pod \"cinder-db-create-mbps4\" (UID: \"685f13a4-d293-4199-8049-67b02c0162c1\") " pod="openstack/cinder-db-create-mbps4" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.199925 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpbm8\" (UniqueName: \"kubernetes.io/projected/685f13a4-d293-4199-8049-67b02c0162c1-kube-api-access-dpbm8\") pod \"cinder-db-create-mbps4\" (UID: \"685f13a4-d293-4199-8049-67b02c0162c1\") " pod="openstack/cinder-db-create-mbps4" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.200225 4837 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.200240 4837 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.200258 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jgwn\" (UniqueName: \"kubernetes.io/projected/6776e647-6987-4359-baa9-14ba621118d2-kube-api-access-5jgwn\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.200272 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/6776e647-6987-4359-baa9-14ba621118d2-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.200289 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nqtb\" (UniqueName: \"kubernetes.io/projected/83f46fff-3510-4758-82a0-30099640fa33-kube-api-access-9nqtb\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.200301 4837 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-run\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.200318 4837 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6776e647-6987-4359-baa9-14ba621118d2-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.206291 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/685f13a4-d293-4199-8049-67b02c0162c1-operator-scripts\") pod \"cinder-db-create-mbps4\" (UID: \"685f13a4-d293-4199-8049-67b02c0162c1\") " pod="openstack/cinder-db-create-mbps4" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.227719 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9a59-account-create-update-hqxzk"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.235658 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpbm8\" (UniqueName: \"kubernetes.io/projected/685f13a4-d293-4199-8049-67b02c0162c1-kube-api-access-dpbm8\") pod \"cinder-db-create-mbps4\" (UID: \"685f13a4-d293-4199-8049-67b02c0162c1\") " pod="openstack/cinder-db-create-mbps4" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.281331 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-nbhpw-config-hrgcj" event={"ID":"6776e647-6987-4359-baa9-14ba621118d2","Type":"ContainerDied","Data":"83aae3e77d9623e024f76a9f48851e9cd8dc7696270a9bc666154f49b6134727"} Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.281383 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83aae3e77d9623e024f76a9f48851e9cd8dc7696270a9bc666154f49b6134727" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.282138 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.282375 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-mbps4" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.289867 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"8eb2f07b8224427aa81b367530ce692a82952d6a8d767f3db25e035e590da8a4"} Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.289916 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"746a6c97fb0b09e268cdc6160d706911a9bd2285c1da876051d1a9a15009d4fc"} Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.289926 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"0497c856aa70619719f32b639c7ef5b1378ca0f50dd52e61ec0619c5244d0050"} Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.299013 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556726-gdbfm" 
event={"ID":"83f46fff-3510-4758-82a0-30099640fa33","Type":"ContainerDied","Data":"97fa7ceb35dbfdb338b3e21974c4f2697b4dc443944351f3a97c4fa67dd5702a"} Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.299052 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97fa7ceb35dbfdb338b3e21974c4f2697b4dc443944351f3a97c4fa67dd5702a" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.299136 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556726-gdbfm" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.301225 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72768daf-a5fa-4c8e-b9c3-49cd5f87fe30-operator-scripts\") pod \"cinder-9a59-account-create-update-hqxzk\" (UID: \"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30\") " pod="openstack/cinder-9a59-account-create-update-hqxzk" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.301317 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvpgs\" (UniqueName: \"kubernetes.io/projected/72768daf-a5fa-4c8e-b9c3-49cd5f87fe30-kube-api-access-jvpgs\") pod \"cinder-9a59-account-create-update-hqxzk\" (UID: \"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30\") " pod="openstack/cinder-9a59-account-create-update-hqxzk" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.337323 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-2dlt8"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.338970 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-2dlt8" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.341671 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.342189 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-w6mdg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.342466 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.343032 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.378479 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.09037901 podStartE2EDuration="36.378460118s" podCreationTimestamp="2026-03-13 12:05:38 +0000 UTC" firstStartedPulling="2026-03-13 12:05:56.179279255 +0000 UTC m=+1071.817546018" lastFinishedPulling="2026-03-13 12:06:12.467360363 +0000 UTC m=+1088.105627126" observedRunningTime="2026-03-13 12:06:14.353554639 +0000 UTC m=+1089.991821402" watchObservedRunningTime="2026-03-13 12:06:14.378460118 +0000 UTC m=+1090.016726881" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.379381 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2dlt8"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.402293 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72768daf-a5fa-4c8e-b9c3-49cd5f87fe30-operator-scripts\") pod \"cinder-9a59-account-create-update-hqxzk\" (UID: \"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30\") " pod="openstack/cinder-9a59-account-create-update-hqxzk" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.402388 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvpgs\" (UniqueName: \"kubernetes.io/projected/72768daf-a5fa-4c8e-b9c3-49cd5f87fe30-kube-api-access-jvpgs\") pod \"cinder-9a59-account-create-update-hqxzk\" (UID: \"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30\") " pod="openstack/cinder-9a59-account-create-update-hqxzk" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.404913 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72768daf-a5fa-4c8e-b9c3-49cd5f87fe30-operator-scripts\") pod \"cinder-9a59-account-create-update-hqxzk\" (UID: \"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30\") " pod="openstack/cinder-9a59-account-create-update-hqxzk" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.438536 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvpgs\" (UniqueName: \"kubernetes.io/projected/72768daf-a5fa-4c8e-b9c3-49cd5f87fe30-kube-api-access-jvpgs\") pod \"cinder-9a59-account-create-update-hqxzk\" (UID: \"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30\") " pod="openstack/cinder-9a59-account-create-update-hqxzk" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.439905 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-45j5g"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.441073 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-45j5g" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.456623 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-330b-account-create-update-snkff"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.464009 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-45j5g"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.464314 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-330b-account-create-update-snkff" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.466425 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.507913 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-config-data\") pod \"keystone-db-sync-2dlt8\" (UID: \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\") " pod="openstack/keystone-db-sync-2dlt8" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.508437 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-330b-account-create-update-snkff"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.509174 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9a59-account-create-update-hqxzk" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.510747 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqhqt\" (UniqueName: \"kubernetes.io/projected/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-kube-api-access-cqhqt\") pod \"keystone-db-sync-2dlt8\" (UID: \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\") " pod="openstack/keystone-db-sync-2dlt8" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.510904 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-combined-ca-bundle\") pod \"keystone-db-sync-2dlt8\" (UID: \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\") " pod="openstack/keystone-db-sync-2dlt8" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.511111 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-l949g\" (UniqueName: \"kubernetes.io/projected/e6b37e8b-50ec-402e-ae31-27ff0d84e0be-kube-api-access-l949g\") pod \"neutron-db-create-45j5g\" (UID: \"e6b37e8b-50ec-402e-ae31-27ff0d84e0be\") " pod="openstack/neutron-db-create-45j5g" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.511192 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6b37e8b-50ec-402e-ae31-27ff0d84e0be-operator-scripts\") pod \"neutron-db-create-45j5g\" (UID: \"e6b37e8b-50ec-402e-ae31-27ff0d84e0be\") " pod="openstack/neutron-db-create-45j5g" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.612078 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l949g\" (UniqueName: \"kubernetes.io/projected/e6b37e8b-50ec-402e-ae31-27ff0d84e0be-kube-api-access-l949g\") pod \"neutron-db-create-45j5g\" (UID: \"e6b37e8b-50ec-402e-ae31-27ff0d84e0be\") " pod="openstack/neutron-db-create-45j5g" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.616299 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6b37e8b-50ec-402e-ae31-27ff0d84e0be-operator-scripts\") pod \"neutron-db-create-45j5g\" (UID: \"e6b37e8b-50ec-402e-ae31-27ff0d84e0be\") " pod="openstack/neutron-db-create-45j5g" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.616569 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-config-data\") pod \"keystone-db-sync-2dlt8\" (UID: \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\") " pod="openstack/keystone-db-sync-2dlt8" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.616766 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgmcl\" (UniqueName: 
\"kubernetes.io/projected/77cef7b0-af86-456f-973b-923cb901b88d-kube-api-access-tgmcl\") pod \"neutron-330b-account-create-update-snkff\" (UID: \"77cef7b0-af86-456f-973b-923cb901b88d\") " pod="openstack/neutron-330b-account-create-update-snkff" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.616794 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77cef7b0-af86-456f-973b-923cb901b88d-operator-scripts\") pod \"neutron-330b-account-create-update-snkff\" (UID: \"77cef7b0-af86-456f-973b-923cb901b88d\") " pod="openstack/neutron-330b-account-create-update-snkff" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.616826 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqhqt\" (UniqueName: \"kubernetes.io/projected/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-kube-api-access-cqhqt\") pod \"keystone-db-sync-2dlt8\" (UID: \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\") " pod="openstack/keystone-db-sync-2dlt8" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.616868 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-combined-ca-bundle\") pod \"keystone-db-sync-2dlt8\" (UID: \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\") " pod="openstack/keystone-db-sync-2dlt8" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.617089 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6b37e8b-50ec-402e-ae31-27ff0d84e0be-operator-scripts\") pod \"neutron-db-create-45j5g\" (UID: \"e6b37e8b-50ec-402e-ae31-27ff0d84e0be\") " pod="openstack/neutron-db-create-45j5g" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.627686 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-combined-ca-bundle\") pod \"keystone-db-sync-2dlt8\" (UID: \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\") " pod="openstack/keystone-db-sync-2dlt8" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.627926 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-config-data\") pod \"keystone-db-sync-2dlt8\" (UID: \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\") " pod="openstack/keystone-db-sync-2dlt8" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.628685 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-g24hg"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.630182 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-g24hg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.647724 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-g24hg"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.658196 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-6b07-account-create-update-wxqsd"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.659550 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6b07-account-create-update-wxqsd" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.673077 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.679943 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqhqt\" (UniqueName: \"kubernetes.io/projected/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-kube-api-access-cqhqt\") pod \"keystone-db-sync-2dlt8\" (UID: \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\") " pod="openstack/keystone-db-sync-2dlt8" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.682226 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l949g\" (UniqueName: \"kubernetes.io/projected/e6b37e8b-50ec-402e-ae31-27ff0d84e0be-kube-api-access-l949g\") pod \"neutron-db-create-45j5g\" (UID: \"e6b37e8b-50ec-402e-ae31-27ff0d84e0be\") " pod="openstack/neutron-db-create-45j5g" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.711752 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6b07-account-create-update-wxqsd"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.720808 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjmfh\" (UniqueName: \"kubernetes.io/projected/f029b52a-1a09-44b3-affe-9449cd6a5944-kube-api-access-cjmfh\") pod \"barbican-db-create-g24hg\" (UID: \"f029b52a-1a09-44b3-affe-9449cd6a5944\") " pod="openstack/barbican-db-create-g24hg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.720867 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgmcl\" (UniqueName: \"kubernetes.io/projected/77cef7b0-af86-456f-973b-923cb901b88d-kube-api-access-tgmcl\") pod \"neutron-330b-account-create-update-snkff\" (UID: \"77cef7b0-af86-456f-973b-923cb901b88d\") " 
pod="openstack/neutron-330b-account-create-update-snkff" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.720902 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77cef7b0-af86-456f-973b-923cb901b88d-operator-scripts\") pod \"neutron-330b-account-create-update-snkff\" (UID: \"77cef7b0-af86-456f-973b-923cb901b88d\") " pod="openstack/neutron-330b-account-create-update-snkff" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.721025 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f029b52a-1a09-44b3-affe-9449cd6a5944-operator-scripts\") pod \"barbican-db-create-g24hg\" (UID: \"f029b52a-1a09-44b3-affe-9449cd6a5944\") " pod="openstack/barbican-db-create-g24hg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.721826 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77cef7b0-af86-456f-973b-923cb901b88d-operator-scripts\") pod \"neutron-330b-account-create-update-snkff\" (UID: \"77cef7b0-af86-456f-973b-923cb901b88d\") " pod="openstack/neutron-330b-account-create-update-snkff" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.740815 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-5nlfg"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.759769 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.760747 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-45j5g" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.769637 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.786163 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgmcl\" (UniqueName: \"kubernetes.io/projected/77cef7b0-af86-456f-973b-923cb901b88d-kube-api-access-tgmcl\") pod \"neutron-330b-account-create-update-snkff\" (UID: \"77cef7b0-af86-456f-973b-923cb901b88d\") " pod="openstack/neutron-330b-account-create-update-snkff" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.801563 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-5nlfg"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.823230 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f029b52a-1a09-44b3-affe-9449cd6a5944-operator-scripts\") pod \"barbican-db-create-g24hg\" (UID: \"f029b52a-1a09-44b3-affe-9449cd6a5944\") " pod="openstack/barbican-db-create-g24hg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.823512 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjmfh\" (UniqueName: \"kubernetes.io/projected/f029b52a-1a09-44b3-affe-9449cd6a5944-kube-api-access-cjmfh\") pod \"barbican-db-create-g24hg\" (UID: \"f029b52a-1a09-44b3-affe-9449cd6a5944\") " pod="openstack/barbican-db-create-g24hg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.823698 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sgn9\" (UniqueName: \"kubernetes.io/projected/a78456e1-6f14-45d4-ab3f-1fea88af4749-kube-api-access-4sgn9\") pod \"barbican-6b07-account-create-update-wxqsd\" (UID: \"a78456e1-6f14-45d4-ab3f-1fea88af4749\") " 
pod="openstack/barbican-6b07-account-create-update-wxqsd" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.823813 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a78456e1-6f14-45d4-ab3f-1fea88af4749-operator-scripts\") pod \"barbican-6b07-account-create-update-wxqsd\" (UID: \"a78456e1-6f14-45d4-ab3f-1fea88af4749\") " pod="openstack/barbican-6b07-account-create-update-wxqsd" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.824028 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f029b52a-1a09-44b3-affe-9449cd6a5944-operator-scripts\") pod \"barbican-db-create-g24hg\" (UID: \"f029b52a-1a09-44b3-affe-9449cd6a5944\") " pod="openstack/barbican-db-create-g24hg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.853022 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjmfh\" (UniqueName: \"kubernetes.io/projected/f029b52a-1a09-44b3-affe-9449cd6a5944-kube-api-access-cjmfh\") pod \"barbican-db-create-g24hg\" (UID: \"f029b52a-1a09-44b3-affe-9449cd6a5944\") " pod="openstack/barbican-db-create-g24hg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.917049 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-330b-account-create-update-snkff" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.925210 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.925301 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trr9k\" (UniqueName: \"kubernetes.io/projected/1a847add-da54-4a5d-9bca-5aea455eefe8-kube-api-access-trr9k\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.925350 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sgn9\" (UniqueName: \"kubernetes.io/projected/a78456e1-6f14-45d4-ab3f-1fea88af4749-kube-api-access-4sgn9\") pod \"barbican-6b07-account-create-update-wxqsd\" (UID: \"a78456e1-6f14-45d4-ab3f-1fea88af4749\") " pod="openstack/barbican-6b07-account-create-update-wxqsd" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.925376 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a78456e1-6f14-45d4-ab3f-1fea88af4749-operator-scripts\") pod \"barbican-6b07-account-create-update-wxqsd\" (UID: \"a78456e1-6f14-45d4-ab3f-1fea88af4749\") " pod="openstack/barbican-6b07-account-create-update-wxqsd" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.925423 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-dns-svc\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.925487 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-config\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.925549 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.925570 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.927022 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a78456e1-6f14-45d4-ab3f-1fea88af4749-operator-scripts\") pod \"barbican-6b07-account-create-update-wxqsd\" (UID: \"a78456e1-6f14-45d4-ab3f-1fea88af4749\") " pod="openstack/barbican-6b07-account-create-update-wxqsd" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.950814 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sgn9\" (UniqueName: 
\"kubernetes.io/projected/a78456e1-6f14-45d4-ab3f-1fea88af4749-kube-api-access-4sgn9\") pod \"barbican-6b07-account-create-update-wxqsd\" (UID: \"a78456e1-6f14-45d4-ab3f-1fea88af4749\") " pod="openstack/barbican-6b07-account-create-update-wxqsd" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.986859 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2dlt8" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.027710 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-config\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.029310 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.029345 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.029405 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:15 crc 
kubenswrapper[4837]: I0313 12:06:15.029448 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trr9k\" (UniqueName: \"kubernetes.io/projected/1a847add-da54-4a5d-9bca-5aea455eefe8-kube-api-access-trr9k\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.029518 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-dns-svc\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.031591 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-dns-svc\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.032498 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-config\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.033165 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.035174 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.035868 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.069543 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-g24hg" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.102139 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556720-wqrqr"] Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.102393 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6b07-account-create-update-wxqsd" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.129475 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trr9k\" (UniqueName: \"kubernetes.io/projected/1a847add-da54-4a5d-9bca-5aea455eefe8-kube-api-access-trr9k\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.152084 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.159878 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556720-wqrqr"] Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.182869 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-mbps4"] Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.212585 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-nbhpw-config-hrgcj"] Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.228763 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-nbhpw-config-hrgcj"] Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.341227 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mbps4" event={"ID":"685f13a4-d293-4199-8049-67b02c0162c1","Type":"ContainerStarted","Data":"e1a54e4a114da1c297e0edf9ab93b2e5ab7a2495817c3040724f0929f933b467"} Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.359863 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9a59-account-create-update-hqxzk"] Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.378098 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nbhpw-config-kxsw9"] Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.380447 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.383592 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.406332 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nbhpw-config-kxsw9"] Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.459865 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-45j5g"] Mar 13 12:06:15 crc kubenswrapper[4837]: W0313 12:06:15.484380 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6b37e8b_50ec_402e_ae31_27ff0d84e0be.slice/crio-80fc33f6bc63815e77d98b7326e47c118cb3973d0734718d9a8997233d51b356 WatchSource:0}: Error finding container 80fc33f6bc63815e77d98b7326e47c118cb3973d0734718d9a8997233d51b356: Status 404 returned error can't find the container with id 80fc33f6bc63815e77d98b7326e47c118cb3973d0734718d9a8997233d51b356 Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.556901 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n59vd\" (UniqueName: \"kubernetes.io/projected/c1cab316-6ffc-483a-9c64-76be9ac13753-kube-api-access-n59vd\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.556983 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1cab316-6ffc-483a-9c64-76be9ac13753-scripts\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: 
I0313 12:06:15.557036 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-run-ovn\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.557061 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-log-ovn\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.557079 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c1cab316-6ffc-483a-9c64-76be9ac13753-additional-scripts\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.557145 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-run\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.661389 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-run\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" 
Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.661935 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n59vd\" (UniqueName: \"kubernetes.io/projected/c1cab316-6ffc-483a-9c64-76be9ac13753-kube-api-access-n59vd\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.662196 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1cab316-6ffc-483a-9c64-76be9ac13753-scripts\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.662305 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-run-ovn\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.662331 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-log-ovn\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.662352 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c1cab316-6ffc-483a-9c64-76be9ac13753-additional-scripts\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 
12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.662438 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-run\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.662515 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-run-ovn\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.662553 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-log-ovn\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.663146 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c1cab316-6ffc-483a-9c64-76be9ac13753-additional-scripts\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.664627 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1cab316-6ffc-483a-9c64-76be9ac13753-scripts\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.687848 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n59vd\" (UniqueName: \"kubernetes.io/projected/c1cab316-6ffc-483a-9c64-76be9ac13753-kube-api-access-n59vd\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.702136 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-330b-account-create-update-snkff"] Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.715956 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.857917 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2dlt8"] Mar 13 12:06:15 crc kubenswrapper[4837]: W0313 12:06:15.860871 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19cfb16d_f7a7_4f5d_baa9_b00eaecf1dfe.slice/crio-396e456946df28247885135847e00125b4072fdad39608b68bcd78f42b00f1ad WatchSource:0}: Error finding container 396e456946df28247885135847e00125b4072fdad39608b68bcd78f42b00f1ad: Status 404 returned error can't find the container with id 396e456946df28247885135847e00125b4072fdad39608b68bcd78f42b00f1ad Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.953292 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6b07-account-create-update-wxqsd"] Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:15.999544 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-g24hg"] Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.044811 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-5nlfg"] Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.367171 4837 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" event={"ID":"1a847add-da54-4a5d-9bca-5aea455eefe8","Type":"ContainerStarted","Data":"2cbead7100d7df29ad960b80cb3c7ee5eb871cec6fea242940565dd0d3726566"} Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.384364 4837 generic.go:334] "Generic (PLEG): container finished" podID="685f13a4-d293-4199-8049-67b02c0162c1" containerID="8295d45762eef27ce4120c578b478e84691da779f8c9457d397485b5b46c5eba" exitCode=0 Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.384448 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mbps4" event={"ID":"685f13a4-d293-4199-8049-67b02c0162c1","Type":"ContainerDied","Data":"8295d45762eef27ce4120c578b478e84691da779f8c9457d397485b5b46c5eba"} Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.392181 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-45j5g" event={"ID":"e6b37e8b-50ec-402e-ae31-27ff0d84e0be","Type":"ContainerStarted","Data":"286a6a1365f30df6b40943e24ec3066d64b002e22ec98bea016b42eeee5b1160"} Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.392230 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-45j5g" event={"ID":"e6b37e8b-50ec-402e-ae31-27ff0d84e0be","Type":"ContainerStarted","Data":"80fc33f6bc63815e77d98b7326e47c118cb3973d0734718d9a8997233d51b356"} Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.394701 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2dlt8" event={"ID":"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe","Type":"ContainerStarted","Data":"396e456946df28247885135847e00125b4072fdad39608b68bcd78f42b00f1ad"} Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.401895 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-g24hg" 
event={"ID":"f029b52a-1a09-44b3-affe-9449cd6a5944","Type":"ContainerStarted","Data":"41647603cfd4e0b54c4f06a96b6516d64d08f596f9eedd470d536cab15741d89"} Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.409382 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-330b-account-create-update-snkff" event={"ID":"77cef7b0-af86-456f-973b-923cb901b88d","Type":"ContainerStarted","Data":"42bfa52d5c8c4ce4aaf6212f222930fd5d442e727a1a7b492df691e11a1e81f6"} Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.409431 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-330b-account-create-update-snkff" event={"ID":"77cef7b0-af86-456f-973b-923cb901b88d","Type":"ContainerStarted","Data":"4d1782dc0e0e5e3ccefab2466f15606ba9a84930c835694be9443f6d78652434"} Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.413970 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9a59-account-create-update-hqxzk" event={"ID":"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30","Type":"ContainerStarted","Data":"d01d04b228faf7f13c332e53f55aacbde9b692f0da2cccf686b1a57f52fa8fe2"} Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.414000 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9a59-account-create-update-hqxzk" event={"ID":"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30","Type":"ContainerStarted","Data":"a0ee92e3c04affb9424f2cee437c2e7d68ccc6bb7f32033e929beb9ec040971c"} Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.421143 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6b07-account-create-update-wxqsd" event={"ID":"a78456e1-6f14-45d4-ab3f-1fea88af4749","Type":"ContainerStarted","Data":"9b38c974a02200392bdef3725afd6e7af730d3588ea3167463053055a9dc7c97"} Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.458488 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-g24hg" podStartSLOduration=2.458450484 
podStartE2EDuration="2.458450484s" podCreationTimestamp="2026-03-13 12:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:16.43841316 +0000 UTC m=+1092.076679923" watchObservedRunningTime="2026-03-13 12:06:16.458450484 +0000 UTC m=+1092.096717247" Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.518635 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-45j5g" podStartSLOduration=2.5186004889999998 podStartE2EDuration="2.518600489s" podCreationTimestamp="2026-03-13 12:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:16.496869331 +0000 UTC m=+1092.135136094" watchObservedRunningTime="2026-03-13 12:06:16.518600489 +0000 UTC m=+1092.156867252" Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.544861 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nbhpw-config-kxsw9"] Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.571787 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-330b-account-create-update-snkff" podStartSLOduration=2.571770593 podStartE2EDuration="2.571770593s" podCreationTimestamp="2026-03-13 12:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:16.567476568 +0000 UTC m=+1092.205743331" watchObservedRunningTime="2026-03-13 12:06:16.571770593 +0000 UTC m=+1092.210037356" Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.620495 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-9a59-account-create-update-hqxzk" podStartSLOduration=2.620078764 podStartE2EDuration="2.620078764s" podCreationTimestamp="2026-03-13 12:06:14 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:16.618011899 +0000 UTC m=+1092.256278662" watchObservedRunningTime="2026-03-13 12:06:16.620078764 +0000 UTC m=+1092.258345527" Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.061750 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1335d65b-c0fb-4085-86eb-d948f797ef68" path="/var/lib/kubelet/pods/1335d65b-c0fb-4085-86eb-d948f797ef68/volumes" Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.062824 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6776e647-6987-4359-baa9-14ba621118d2" path="/var/lib/kubelet/pods/6776e647-6987-4359-baa9-14ba621118d2/volumes" Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.430784 4837 generic.go:334] "Generic (PLEG): container finished" podID="e6b37e8b-50ec-402e-ae31-27ff0d84e0be" containerID="286a6a1365f30df6b40943e24ec3066d64b002e22ec98bea016b42eeee5b1160" exitCode=0 Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.430839 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-45j5g" event={"ID":"e6b37e8b-50ec-402e-ae31-27ff0d84e0be","Type":"ContainerDied","Data":"286a6a1365f30df6b40943e24ec3066d64b002e22ec98bea016b42eeee5b1160"} Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.433234 4837 generic.go:334] "Generic (PLEG): container finished" podID="c1cab316-6ffc-483a-9c64-76be9ac13753" containerID="f8f234cd31d0132024229747ad2a8277b3ce2f09009460632455703d08203032" exitCode=0 Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.433290 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nbhpw-config-kxsw9" event={"ID":"c1cab316-6ffc-483a-9c64-76be9ac13753","Type":"ContainerDied","Data":"f8f234cd31d0132024229747ad2a8277b3ce2f09009460632455703d08203032"} Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.433307 4837 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovn-controller-nbhpw-config-kxsw9" event={"ID":"c1cab316-6ffc-483a-9c64-76be9ac13753","Type":"ContainerStarted","Data":"d8e2e7d1269d007fdcb65dd9073b7a70d074c7f6f3d78656f5594e0618ab2446"} Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.436473 4837 generic.go:334] "Generic (PLEG): container finished" podID="f029b52a-1a09-44b3-affe-9449cd6a5944" containerID="4ef4f42482f9efbb7e95ba0aa3a8a4567cffbb3946a12623724cae5ed211d4e1" exitCode=0 Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.436541 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-g24hg" event={"ID":"f029b52a-1a09-44b3-affe-9449cd6a5944","Type":"ContainerDied","Data":"4ef4f42482f9efbb7e95ba0aa3a8a4567cffbb3946a12623724cae5ed211d4e1"} Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.439830 4837 generic.go:334] "Generic (PLEG): container finished" podID="77cef7b0-af86-456f-973b-923cb901b88d" containerID="42bfa52d5c8c4ce4aaf6212f222930fd5d442e727a1a7b492df691e11a1e81f6" exitCode=0 Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.439904 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-330b-account-create-update-snkff" event={"ID":"77cef7b0-af86-456f-973b-923cb901b88d","Type":"ContainerDied","Data":"42bfa52d5c8c4ce4aaf6212f222930fd5d442e727a1a7b492df691e11a1e81f6"} Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.441936 4837 generic.go:334] "Generic (PLEG): container finished" podID="72768daf-a5fa-4c8e-b9c3-49cd5f87fe30" containerID="d01d04b228faf7f13c332e53f55aacbde9b692f0da2cccf686b1a57f52fa8fe2" exitCode=0 Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.442092 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9a59-account-create-update-hqxzk" event={"ID":"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30","Type":"ContainerDied","Data":"d01d04b228faf7f13c332e53f55aacbde9b692f0da2cccf686b1a57f52fa8fe2"} Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 
12:06:17.446469 4837 generic.go:334] "Generic (PLEG): container finished" podID="a78456e1-6f14-45d4-ab3f-1fea88af4749" containerID="fbb8d3067503d33b0b6e6a915789395c7b9c10818b3ce84f4506b15f77d6207f" exitCode=0 Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.446562 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6b07-account-create-update-wxqsd" event={"ID":"a78456e1-6f14-45d4-ab3f-1fea88af4749","Type":"ContainerDied","Data":"fbb8d3067503d33b0b6e6a915789395c7b9c10818b3ce84f4506b15f77d6207f"} Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.468817 4837 generic.go:334] "Generic (PLEG): container finished" podID="1a847add-da54-4a5d-9bca-5aea455eefe8" containerID="88f5a9c016c890932c1524d02aeb53601bb1a2cc77b41ca9cf3fabeb2713f8a0" exitCode=0 Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.469296 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" event={"ID":"1a847add-da54-4a5d-9bca-5aea455eefe8","Type":"ContainerDied","Data":"88f5a9c016c890932c1524d02aeb53601bb1a2cc77b41ca9cf3fabeb2713f8a0"} Mar 13 12:06:18 crc kubenswrapper[4837]: I0313 12:06:17.860708 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-mbps4" Mar 13 12:06:18 crc kubenswrapper[4837]: I0313 12:06:18.030358 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/685f13a4-d293-4199-8049-67b02c0162c1-operator-scripts\") pod \"685f13a4-d293-4199-8049-67b02c0162c1\" (UID: \"685f13a4-d293-4199-8049-67b02c0162c1\") " Mar 13 12:06:18 crc kubenswrapper[4837]: I0313 12:06:18.030558 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpbm8\" (UniqueName: \"kubernetes.io/projected/685f13a4-d293-4199-8049-67b02c0162c1-kube-api-access-dpbm8\") pod \"685f13a4-d293-4199-8049-67b02c0162c1\" (UID: \"685f13a4-d293-4199-8049-67b02c0162c1\") " Mar 13 12:06:18 crc kubenswrapper[4837]: I0313 12:06:18.031014 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/685f13a4-d293-4199-8049-67b02c0162c1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "685f13a4-d293-4199-8049-67b02c0162c1" (UID: "685f13a4-d293-4199-8049-67b02c0162c1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:18 crc kubenswrapper[4837]: I0313 12:06:18.031373 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/685f13a4-d293-4199-8049-67b02c0162c1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:18 crc kubenswrapper[4837]: I0313 12:06:18.035404 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/685f13a4-d293-4199-8049-67b02c0162c1-kube-api-access-dpbm8" (OuterVolumeSpecName: "kube-api-access-dpbm8") pod "685f13a4-d293-4199-8049-67b02c0162c1" (UID: "685f13a4-d293-4199-8049-67b02c0162c1"). InnerVolumeSpecName "kube-api-access-dpbm8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:18 crc kubenswrapper[4837]: I0313 12:06:18.135016 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpbm8\" (UniqueName: \"kubernetes.io/projected/685f13a4-d293-4199-8049-67b02c0162c1-kube-api-access-dpbm8\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:18 crc kubenswrapper[4837]: I0313 12:06:18.481178 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" event={"ID":"1a847add-da54-4a5d-9bca-5aea455eefe8","Type":"ContainerStarted","Data":"7fd2e269ac89746bd02c6eb6e013fcc551156a1538f9f4807e06a63dd46236d2"} Mar 13 12:06:18 crc kubenswrapper[4837]: I0313 12:06:18.481805 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:18 crc kubenswrapper[4837]: I0313 12:06:18.483862 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mbps4" event={"ID":"685f13a4-d293-4199-8049-67b02c0162c1","Type":"ContainerDied","Data":"e1a54e4a114da1c297e0edf9ab93b2e5ab7a2495817c3040724f0929f933b467"} Mar 13 12:06:18 crc kubenswrapper[4837]: I0313 12:06:18.483903 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1a54e4a114da1c297e0edf9ab93b2e5ab7a2495817c3040724f0929f933b467" Mar 13 12:06:18 crc kubenswrapper[4837]: I0313 12:06:18.484043 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-mbps4" Mar 13 12:06:18 crc kubenswrapper[4837]: I0313 12:06:18.511576 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" podStartSLOduration=4.511560068 podStartE2EDuration="4.511560068s" podCreationTimestamp="2026-03-13 12:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:18.504754203 +0000 UTC m=+1094.143020986" watchObservedRunningTime="2026-03-13 12:06:18.511560068 +0000 UTC m=+1094.149826831" Mar 13 12:06:18 crc kubenswrapper[4837]: E0313 12:06:18.559350 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod685f13a4_d293_4199_8049_67b02c0162c1.slice/crio-e1a54e4a114da1c297e0edf9ab93b2e5ab7a2495817c3040724f0929f933b467\": RecentStats: unable to find data in memory cache]" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.446704 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-45j5g" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.455484 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9a59-account-create-update-hqxzk" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.506246 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-330b-account-create-update-snkff" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.515080 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-330b-account-create-update-snkff" event={"ID":"77cef7b0-af86-456f-973b-923cb901b88d","Type":"ContainerDied","Data":"4d1782dc0e0e5e3ccefab2466f15606ba9a84930c835694be9443f6d78652434"} Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.515114 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d1782dc0e0e5e3ccefab2466f15606ba9a84930c835694be9443f6d78652434" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.515092 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-330b-account-create-update-snkff" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.515411 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6b07-account-create-update-wxqsd" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.518969 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9a59-account-create-update-hqxzk" event={"ID":"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30","Type":"ContainerDied","Data":"a0ee92e3c04affb9424f2cee437c2e7d68ccc6bb7f32033e929beb9ec040971c"} Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.519472 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0ee92e3c04affb9424f2cee437c2e7d68ccc6bb7f32033e929beb9ec040971c" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.519594 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9a59-account-create-update-hqxzk" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.527396 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6b07-account-create-update-wxqsd" event={"ID":"a78456e1-6f14-45d4-ab3f-1fea88af4749","Type":"ContainerDied","Data":"9b38c974a02200392bdef3725afd6e7af730d3588ea3167463053055a9dc7c97"} Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.527430 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b38c974a02200392bdef3725afd6e7af730d3588ea3167463053055a9dc7c97" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.527483 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6b07-account-create-update-wxqsd" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.530627 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-45j5g" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.530728 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-45j5g" event={"ID":"e6b37e8b-50ec-402e-ae31-27ff0d84e0be","Type":"ContainerDied","Data":"80fc33f6bc63815e77d98b7326e47c118cb3973d0734718d9a8997233d51b356"} Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.530757 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80fc33f6bc63815e77d98b7326e47c118cb3973d0734718d9a8997233d51b356" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.531876 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.532603 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nbhpw-config-kxsw9" event={"ID":"c1cab316-6ffc-483a-9c64-76be9ac13753","Type":"ContainerDied","Data":"d8e2e7d1269d007fdcb65dd9073b7a70d074c7f6f3d78656f5594e0618ab2446"} Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.532621 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8e2e7d1269d007fdcb65dd9073b7a70d074c7f6f3d78656f5594e0618ab2446" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.535803 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-g24hg" event={"ID":"f029b52a-1a09-44b3-affe-9449cd6a5944","Type":"ContainerDied","Data":"41647603cfd4e0b54c4f06a96b6516d64d08f596f9eedd470d536cab15741d89"} Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.535840 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41647603cfd4e0b54c4f06a96b6516d64d08f596f9eedd470d536cab15741d89" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.549895 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-g24hg" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.613114 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6b37e8b-50ec-402e-ae31-27ff0d84e0be-operator-scripts\") pod \"e6b37e8b-50ec-402e-ae31-27ff0d84e0be\" (UID: \"e6b37e8b-50ec-402e-ae31-27ff0d84e0be\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.613386 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l949g\" (UniqueName: \"kubernetes.io/projected/e6b37e8b-50ec-402e-ae31-27ff0d84e0be-kube-api-access-l949g\") pod \"e6b37e8b-50ec-402e-ae31-27ff0d84e0be\" (UID: \"e6b37e8b-50ec-402e-ae31-27ff0d84e0be\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.613827 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6b37e8b-50ec-402e-ae31-27ff0d84e0be-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e6b37e8b-50ec-402e-ae31-27ff0d84e0be" (UID: "e6b37e8b-50ec-402e-ae31-27ff0d84e0be"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.614140 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a78456e1-6f14-45d4-ab3f-1fea88af4749-operator-scripts\") pod \"a78456e1-6f14-45d4-ab3f-1fea88af4749\" (UID: \"a78456e1-6f14-45d4-ab3f-1fea88af4749\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.614218 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgmcl\" (UniqueName: \"kubernetes.io/projected/77cef7b0-af86-456f-973b-923cb901b88d-kube-api-access-tgmcl\") pod \"77cef7b0-af86-456f-973b-923cb901b88d\" (UID: \"77cef7b0-af86-456f-973b-923cb901b88d\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.614246 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sgn9\" (UniqueName: \"kubernetes.io/projected/a78456e1-6f14-45d4-ab3f-1fea88af4749-kube-api-access-4sgn9\") pod \"a78456e1-6f14-45d4-ab3f-1fea88af4749\" (UID: \"a78456e1-6f14-45d4-ab3f-1fea88af4749\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.614269 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77cef7b0-af86-456f-973b-923cb901b88d-operator-scripts\") pod \"77cef7b0-af86-456f-973b-923cb901b88d\" (UID: \"77cef7b0-af86-456f-973b-923cb901b88d\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.614537 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72768daf-a5fa-4c8e-b9c3-49cd5f87fe30-operator-scripts\") pod \"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30\" (UID: \"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.614620 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-jvpgs\" (UniqueName: \"kubernetes.io/projected/72768daf-a5fa-4c8e-b9c3-49cd5f87fe30-kube-api-access-jvpgs\") pod \"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30\" (UID: \"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.615136 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6b37e8b-50ec-402e-ae31-27ff0d84e0be-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.614620 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a78456e1-6f14-45d4-ab3f-1fea88af4749-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a78456e1-6f14-45d4-ab3f-1fea88af4749" (UID: "a78456e1-6f14-45d4-ab3f-1fea88af4749"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.615626 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72768daf-a5fa-4c8e-b9c3-49cd5f87fe30-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "72768daf-a5fa-4c8e-b9c3-49cd5f87fe30" (UID: "72768daf-a5fa-4c8e-b9c3-49cd5f87fe30"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.616005 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77cef7b0-af86-456f-973b-923cb901b88d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "77cef7b0-af86-456f-973b-923cb901b88d" (UID: "77cef7b0-af86-456f-973b-923cb901b88d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.618243 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6b37e8b-50ec-402e-ae31-27ff0d84e0be-kube-api-access-l949g" (OuterVolumeSpecName: "kube-api-access-l949g") pod "e6b37e8b-50ec-402e-ae31-27ff0d84e0be" (UID: "e6b37e8b-50ec-402e-ae31-27ff0d84e0be"). InnerVolumeSpecName "kube-api-access-l949g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.618275 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a78456e1-6f14-45d4-ab3f-1fea88af4749-kube-api-access-4sgn9" (OuterVolumeSpecName: "kube-api-access-4sgn9") pod "a78456e1-6f14-45d4-ab3f-1fea88af4749" (UID: "a78456e1-6f14-45d4-ab3f-1fea88af4749"). InnerVolumeSpecName "kube-api-access-4sgn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.619262 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77cef7b0-af86-456f-973b-923cb901b88d-kube-api-access-tgmcl" (OuterVolumeSpecName: "kube-api-access-tgmcl") pod "77cef7b0-af86-456f-973b-923cb901b88d" (UID: "77cef7b0-af86-456f-973b-923cb901b88d"). InnerVolumeSpecName "kube-api-access-tgmcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.619922 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72768daf-a5fa-4c8e-b9c3-49cd5f87fe30-kube-api-access-jvpgs" (OuterVolumeSpecName: "kube-api-access-jvpgs") pod "72768daf-a5fa-4c8e-b9c3-49cd5f87fe30" (UID: "72768daf-a5fa-4c8e-b9c3-49cd5f87fe30"). InnerVolumeSpecName "kube-api-access-jvpgs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.716451 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-log-ovn\") pod \"c1cab316-6ffc-483a-9c64-76be9ac13753\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.716519 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c1cab316-6ffc-483a-9c64-76be9ac13753-additional-scripts\") pod \"c1cab316-6ffc-483a-9c64-76be9ac13753\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.716547 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c1cab316-6ffc-483a-9c64-76be9ac13753" (UID: "c1cab316-6ffc-483a-9c64-76be9ac13753"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.716560 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjmfh\" (UniqueName: \"kubernetes.io/projected/f029b52a-1a09-44b3-affe-9449cd6a5944-kube-api-access-cjmfh\") pod \"f029b52a-1a09-44b3-affe-9449cd6a5944\" (UID: \"f029b52a-1a09-44b3-affe-9449cd6a5944\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.716700 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n59vd\" (UniqueName: \"kubernetes.io/projected/c1cab316-6ffc-483a-9c64-76be9ac13753-kube-api-access-n59vd\") pod \"c1cab316-6ffc-483a-9c64-76be9ac13753\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.716777 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-run-ovn\") pod \"c1cab316-6ffc-483a-9c64-76be9ac13753\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.716869 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-run\") pod \"c1cab316-6ffc-483a-9c64-76be9ac13753\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.716886 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c1cab316-6ffc-483a-9c64-76be9ac13753" (UID: "c1cab316-6ffc-483a-9c64-76be9ac13753"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.716932 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f029b52a-1a09-44b3-affe-9449cd6a5944-operator-scripts\") pod \"f029b52a-1a09-44b3-affe-9449cd6a5944\" (UID: \"f029b52a-1a09-44b3-affe-9449cd6a5944\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.716958 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-run" (OuterVolumeSpecName: "var-run") pod "c1cab316-6ffc-483a-9c64-76be9ac13753" (UID: "c1cab316-6ffc-483a-9c64-76be9ac13753"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.716961 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1cab316-6ffc-483a-9c64-76be9ac13753-scripts\") pod \"c1cab316-6ffc-483a-9c64-76be9ac13753\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.717743 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f029b52a-1a09-44b3-affe-9449cd6a5944-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f029b52a-1a09-44b3-affe-9449cd6a5944" (UID: "f029b52a-1a09-44b3-affe-9449cd6a5944"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.717776 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sgn9\" (UniqueName: \"kubernetes.io/projected/a78456e1-6f14-45d4-ab3f-1fea88af4749-kube-api-access-4sgn9\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.717803 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77cef7b0-af86-456f-973b-923cb901b88d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.717816 4837 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.717829 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72768daf-a5fa-4c8e-b9c3-49cd5f87fe30-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.717841 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvpgs\" (UniqueName: \"kubernetes.io/projected/72768daf-a5fa-4c8e-b9c3-49cd5f87fe30-kube-api-access-jvpgs\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.717853 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l949g\" (UniqueName: \"kubernetes.io/projected/e6b37e8b-50ec-402e-ae31-27ff0d84e0be-kube-api-access-l949g\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.717863 4837 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc 
kubenswrapper[4837]: I0313 12:06:21.717903 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1cab316-6ffc-483a-9c64-76be9ac13753-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c1cab316-6ffc-483a-9c64-76be9ac13753" (UID: "c1cab316-6ffc-483a-9c64-76be9ac13753"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.717922 4837 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-run\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.717937 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a78456e1-6f14-45d4-ab3f-1fea88af4749-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.717951 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgmcl\" (UniqueName: \"kubernetes.io/projected/77cef7b0-af86-456f-973b-923cb901b88d-kube-api-access-tgmcl\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.718086 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1cab316-6ffc-483a-9c64-76be9ac13753-scripts" (OuterVolumeSpecName: "scripts") pod "c1cab316-6ffc-483a-9c64-76be9ac13753" (UID: "c1cab316-6ffc-483a-9c64-76be9ac13753"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.721002 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1cab316-6ffc-483a-9c64-76be9ac13753-kube-api-access-n59vd" (OuterVolumeSpecName: "kube-api-access-n59vd") pod "c1cab316-6ffc-483a-9c64-76be9ac13753" (UID: "c1cab316-6ffc-483a-9c64-76be9ac13753"). InnerVolumeSpecName "kube-api-access-n59vd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.721504 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f029b52a-1a09-44b3-affe-9449cd6a5944-kube-api-access-cjmfh" (OuterVolumeSpecName: "kube-api-access-cjmfh") pod "f029b52a-1a09-44b3-affe-9449cd6a5944" (UID: "f029b52a-1a09-44b3-affe-9449cd6a5944"). InnerVolumeSpecName "kube-api-access-cjmfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.820313 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f029b52a-1a09-44b3-affe-9449cd6a5944-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.820351 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1cab316-6ffc-483a-9c64-76be9ac13753-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.820365 4837 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c1cab316-6ffc-483a-9c64-76be9ac13753-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.820377 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjmfh\" (UniqueName: 
\"kubernetes.io/projected/f029b52a-1a09-44b3-affe-9449cd6a5944-kube-api-access-cjmfh\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.820390 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n59vd\" (UniqueName: \"kubernetes.io/projected/c1cab316-6ffc-483a-9c64-76be9ac13753-kube-api-access-n59vd\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:22 crc kubenswrapper[4837]: I0313 12:06:22.547562 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:22 crc kubenswrapper[4837]: I0313 12:06:22.547562 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2dlt8" event={"ID":"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe","Type":"ContainerStarted","Data":"4278a43d1836aa1abbebaa7d3b0197dd5fc3373adc2b4d3124d2a223104eef56"} Mar 13 12:06:22 crc kubenswrapper[4837]: I0313 12:06:22.547628 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-g24hg" Mar 13 12:06:22 crc kubenswrapper[4837]: I0313 12:06:22.575742 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-2dlt8" podStartSLOduration=3.172906674 podStartE2EDuration="8.575720843s" podCreationTimestamp="2026-03-13 12:06:14 +0000 UTC" firstStartedPulling="2026-03-13 12:06:15.868682042 +0000 UTC m=+1091.506948805" lastFinishedPulling="2026-03-13 12:06:21.271496201 +0000 UTC m=+1096.909762974" observedRunningTime="2026-03-13 12:06:22.569579699 +0000 UTC m=+1098.207846472" watchObservedRunningTime="2026-03-13 12:06:22.575720843 +0000 UTC m=+1098.213987606" Mar 13 12:06:22 crc kubenswrapper[4837]: I0313 12:06:22.643497 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-nbhpw-config-kxsw9"] Mar 13 12:06:22 crc kubenswrapper[4837]: I0313 12:06:22.655292 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-nbhpw-config-kxsw9"] Mar 13 12:06:23 crc kubenswrapper[4837]: I0313 12:06:23.059433 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1cab316-6ffc-483a-9c64-76be9ac13753" path="/var/lib/kubelet/pods/c1cab316-6ffc-483a-9c64-76be9ac13753/volumes" Mar 13 12:06:23 crc kubenswrapper[4837]: I0313 12:06:23.161772 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:06:24 crc kubenswrapper[4837]: I0313 12:06:24.564971 4837 generic.go:334] "Generic (PLEG): container finished" podID="19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe" containerID="4278a43d1836aa1abbebaa7d3b0197dd5fc3373adc2b4d3124d2a223104eef56" exitCode=0 Mar 13 12:06:24 crc kubenswrapper[4837]: I0313 12:06:24.565056 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2dlt8" 
event={"ID":"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe","Type":"ContainerDied","Data":"4278a43d1836aa1abbebaa7d3b0197dd5fc3373adc2b4d3124d2a223104eef56"} Mar 13 12:06:24 crc kubenswrapper[4837]: E0313 12:06:24.866241 4837 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.138:45154->38.102.83.138:43005: read tcp 38.102.83.138:45154->38.102.83.138:43005: read: connection reset by peer Mar 13 12:06:24 crc kubenswrapper[4837]: E0313 12:06:24.866258 4837 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.138:45154->38.102.83.138:43005: write tcp 38.102.83.138:45154->38.102.83.138:43005: write: broken pipe Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.153806 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.207211 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gqrt7"] Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.207426 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-gqrt7" podUID="de68f8fe-0650-4ef4-9445-d31e119de423" containerName="dnsmasq-dns" containerID="cri-o://a3d9d75be9f89d9ac614473e4e3a4f535965320bd55937576eb6b69f6cb8f8b9" gracePeriod=10 Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.578960 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jkthw" event={"ID":"b4490fb3-45d7-4b40-ad34-5bf33ba88491","Type":"ContainerStarted","Data":"483a91e4e8aeb62a4bc9d00fab2fa3f3452e90337b10ae7eb6d6d40d39b495c8"} Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.586667 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gqrt7" 
event={"ID":"de68f8fe-0650-4ef4-9445-d31e119de423","Type":"ContainerDied","Data":"a3d9d75be9f89d9ac614473e4e3a4f535965320bd55937576eb6b69f6cb8f8b9"} Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.586677 4837 generic.go:334] "Generic (PLEG): container finished" podID="de68f8fe-0650-4ef4-9445-d31e119de423" containerID="a3d9d75be9f89d9ac614473e4e3a4f535965320bd55937576eb6b69f6cb8f8b9" exitCode=0 Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.586781 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gqrt7" event={"ID":"de68f8fe-0650-4ef4-9445-d31e119de423","Type":"ContainerDied","Data":"d23a17995d98b2790b62117dc60f3874a46893982c985ce77e930333e0f2f46d"} Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.586802 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d23a17995d98b2790b62117dc60f3874a46893982c985ce77e930333e0f2f46d" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.606449 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-jkthw" podStartSLOduration=2.611277292 podStartE2EDuration="34.606428233s" podCreationTimestamp="2026-03-13 12:05:51 +0000 UTC" firstStartedPulling="2026-03-13 12:05:52.490668946 +0000 UTC m=+1068.128935709" lastFinishedPulling="2026-03-13 12:06:24.485819887 +0000 UTC m=+1100.124086650" observedRunningTime="2026-03-13 12:06:25.598694669 +0000 UTC m=+1101.236961432" watchObservedRunningTime="2026-03-13 12:06:25.606428233 +0000 UTC m=+1101.244694996" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.646663 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.684260 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-dns-svc\") pod \"de68f8fe-0650-4ef4-9445-d31e119de423\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.684418 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq8p7\" (UniqueName: \"kubernetes.io/projected/de68f8fe-0650-4ef4-9445-d31e119de423-kube-api-access-dq8p7\") pod \"de68f8fe-0650-4ef4-9445-d31e119de423\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.684498 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-ovsdbserver-nb\") pod \"de68f8fe-0650-4ef4-9445-d31e119de423\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.684559 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-config\") pod \"de68f8fe-0650-4ef4-9445-d31e119de423\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.684597 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-ovsdbserver-sb\") pod \"de68f8fe-0650-4ef4-9445-d31e119de423\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.703311 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/de68f8fe-0650-4ef4-9445-d31e119de423-kube-api-access-dq8p7" (OuterVolumeSpecName: "kube-api-access-dq8p7") pod "de68f8fe-0650-4ef4-9445-d31e119de423" (UID: "de68f8fe-0650-4ef4-9445-d31e119de423"). InnerVolumeSpecName "kube-api-access-dq8p7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.754555 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-config" (OuterVolumeSpecName: "config") pod "de68f8fe-0650-4ef4-9445-d31e119de423" (UID: "de68f8fe-0650-4ef4-9445-d31e119de423"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.764143 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "de68f8fe-0650-4ef4-9445-d31e119de423" (UID: "de68f8fe-0650-4ef4-9445-d31e119de423"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.769688 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "de68f8fe-0650-4ef4-9445-d31e119de423" (UID: "de68f8fe-0650-4ef4-9445-d31e119de423"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.781235 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "de68f8fe-0650-4ef4-9445-d31e119de423" (UID: "de68f8fe-0650-4ef4-9445-d31e119de423"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.791168 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.791212 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq8p7\" (UniqueName: \"kubernetes.io/projected/de68f8fe-0650-4ef4-9445-d31e119de423-kube-api-access-dq8p7\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.791226 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.791237 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.791245 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.890761 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-2dlt8" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.993089 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqhqt\" (UniqueName: \"kubernetes.io/projected/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-kube-api-access-cqhqt\") pod \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\" (UID: \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\") " Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.993201 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-config-data\") pod \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\" (UID: \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\") " Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.993291 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-combined-ca-bundle\") pod \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\" (UID: \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\") " Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.997923 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-kube-api-access-cqhqt" (OuterVolumeSpecName: "kube-api-access-cqhqt") pod "19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe" (UID: "19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe"). InnerVolumeSpecName "kube-api-access-cqhqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.039939 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe" (UID: "19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.044761 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-config-data" (OuterVolumeSpecName: "config-data") pod "19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe" (UID: "19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.095444 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqhqt\" (UniqueName: \"kubernetes.io/projected/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-kube-api-access-cqhqt\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.095503 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.095514 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.596695 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.596700 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2dlt8" event={"ID":"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe","Type":"ContainerDied","Data":"396e456946df28247885135847e00125b4072fdad39608b68bcd78f42b00f1ad"} Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.596735 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-2dlt8" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.596750 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="396e456946df28247885135847e00125b4072fdad39608b68bcd78f42b00f1ad" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.636277 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gqrt7"] Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.642252 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gqrt7"] Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.841111 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-x64d8"] Mar 13 12:06:26 crc kubenswrapper[4837]: E0313 12:06:26.841712 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="685f13a4-d293-4199-8049-67b02c0162c1" containerName="mariadb-database-create" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.841734 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="685f13a4-d293-4199-8049-67b02c0162c1" containerName="mariadb-database-create" Mar 13 12:06:26 crc kubenswrapper[4837]: E0313 12:06:26.841753 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de68f8fe-0650-4ef4-9445-d31e119de423" containerName="init" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.841760 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="de68f8fe-0650-4ef4-9445-d31e119de423" containerName="init" Mar 13 12:06:26 crc kubenswrapper[4837]: E0313 12:06:26.841772 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b37e8b-50ec-402e-ae31-27ff0d84e0be" containerName="mariadb-database-create" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.841779 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b37e8b-50ec-402e-ae31-27ff0d84e0be" containerName="mariadb-database-create" Mar 13 12:06:26 crc 
kubenswrapper[4837]: E0313 12:06:26.841792 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a78456e1-6f14-45d4-ab3f-1fea88af4749" containerName="mariadb-account-create-update" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.841802 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a78456e1-6f14-45d4-ab3f-1fea88af4749" containerName="mariadb-account-create-update" Mar 13 12:06:26 crc kubenswrapper[4837]: E0313 12:06:26.841814 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe" containerName="keystone-db-sync" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.841820 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe" containerName="keystone-db-sync" Mar 13 12:06:26 crc kubenswrapper[4837]: E0313 12:06:26.841832 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de68f8fe-0650-4ef4-9445-d31e119de423" containerName="dnsmasq-dns" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.841840 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="de68f8fe-0650-4ef4-9445-d31e119de423" containerName="dnsmasq-dns" Mar 13 12:06:26 crc kubenswrapper[4837]: E0313 12:06:26.841863 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1cab316-6ffc-483a-9c64-76be9ac13753" containerName="ovn-config" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.841873 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1cab316-6ffc-483a-9c64-76be9ac13753" containerName="ovn-config" Mar 13 12:06:26 crc kubenswrapper[4837]: E0313 12:06:26.841882 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72768daf-a5fa-4c8e-b9c3-49cd5f87fe30" containerName="mariadb-account-create-update" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.841888 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="72768daf-a5fa-4c8e-b9c3-49cd5f87fe30" 
containerName="mariadb-account-create-update" Mar 13 12:06:26 crc kubenswrapper[4837]: E0313 12:06:26.841906 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77cef7b0-af86-456f-973b-923cb901b88d" containerName="mariadb-account-create-update" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.841912 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="77cef7b0-af86-456f-973b-923cb901b88d" containerName="mariadb-account-create-update" Mar 13 12:06:26 crc kubenswrapper[4837]: E0313 12:06:26.841920 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f029b52a-1a09-44b3-affe-9449cd6a5944" containerName="mariadb-database-create" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.841927 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f029b52a-1a09-44b3-affe-9449cd6a5944" containerName="mariadb-database-create" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.842111 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1cab316-6ffc-483a-9c64-76be9ac13753" containerName="ovn-config" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.842127 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a78456e1-6f14-45d4-ab3f-1fea88af4749" containerName="mariadb-account-create-update" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.842139 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f029b52a-1a09-44b3-affe-9449cd6a5944" containerName="mariadb-database-create" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.842146 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="de68f8fe-0650-4ef4-9445-d31e119de423" containerName="dnsmasq-dns" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.842158 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe" containerName="keystone-db-sync" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.842168 4837 
memory_manager.go:354] "RemoveStaleState removing state" podUID="72768daf-a5fa-4c8e-b9c3-49cd5f87fe30" containerName="mariadb-account-create-update" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.842182 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="77cef7b0-af86-456f-973b-923cb901b88d" containerName="mariadb-account-create-update" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.842190 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6b37e8b-50ec-402e-ae31-27ff0d84e0be" containerName="mariadb-database-create" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.842197 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="685f13a4-d293-4199-8049-67b02c0162c1" containerName="mariadb-database-create" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.843600 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.858329 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-x64d8"] Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.906715 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-kz7j9"] Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.908275 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.910372 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.912328 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.912658 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.912873 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-w6mdg" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.918451 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.935444 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kz7j9"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.017048 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.017091 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-scripts\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.017111 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-fernet-keys\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.017127 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pj7d\" (UniqueName: \"kubernetes.io/projected/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-kube-api-access-8pj7d\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.017148 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.017168 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-config\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.017182 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-dns-svc\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.017218 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.017246 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-config-data\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.017269 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsj8v\" (UniqueName: \"kubernetes.io/projected/3dec188c-ab95-4544-ac61-6f435f830f97-kube-api-access-gsj8v\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.017309 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-credential-keys\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.017324 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-combined-ca-bundle\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.091000 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="de68f8fe-0650-4ef4-9445-d31e119de423" path="/var/lib/kubelet/pods/de68f8fe-0650-4ef4-9445-d31e119de423/volumes" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.101059 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6b5f9b5c85-p584g"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.102413 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.108531 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.108743 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.108872 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-4srx9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.109040 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.119353 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsj8v\" (UniqueName: \"kubernetes.io/projected/3dec188c-ab95-4544-ac61-6f435f830f97-kube-api-access-gsj8v\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.119419 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-credential-keys\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.119441 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-combined-ca-bundle\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.119494 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.119513 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-fernet-keys\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.119526 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-scripts\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.119542 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pj7d\" (UniqueName: \"kubernetes.io/projected/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-kube-api-access-8pj7d\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.119561 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.119580 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-config\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.119595 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-dns-svc\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.119629 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.119673 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-config-data\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.121993 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.121993 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-config\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.122165 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.122456 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-dns-svc\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.122757 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.125729 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-credential-keys\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " 
pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.131015 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-combined-ca-bundle\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.135962 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-fernet-keys\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.138707 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-config-data\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.158694 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-scripts\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.159271 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pj7d\" (UniqueName: \"kubernetes.io/projected/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-kube-api-access-8pj7d\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.168899 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gsj8v\" (UniqueName: \"kubernetes.io/projected/3dec188c-ab95-4544-ac61-6f435f830f97-kube-api-access-gsj8v\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.169378 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.186270 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6b5f9b5c85-p584g"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.222496 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1f2afb5c-bfb2-4349-8000-4c0c90892d56-horizon-secret-key\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.222528 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scc7s\" (UniqueName: \"kubernetes.io/projected/1f2afb5c-bfb2-4349-8000-4c0c90892d56-kube-api-access-scc7s\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.222548 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f2afb5c-bfb2-4349-8000-4c0c90892d56-logs\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.222603 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/1f2afb5c-bfb2-4349-8000-4c0c90892d56-config-data\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.222646 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f2afb5c-bfb2-4349-8000-4c0c90892d56-scripts\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.229987 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.235141 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.249322 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.254333 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.254500 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.254592 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.268405 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-wdwg2"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.269700 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-wdwg2" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.285131 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.285315 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-88ssc" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.285349 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.324874 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.325317 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.325360 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r7pn\" (UniqueName: \"kubernetes.io/projected/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-kube-api-access-6r7pn\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.325522 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-scripts\") pod \"ceilometer-0\" (UID: 
\"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.325600 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1f2afb5c-bfb2-4349-8000-4c0c90892d56-horizon-secret-key\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.325630 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-config-data\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.325661 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-log-httpd\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.325686 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scc7s\" (UniqueName: \"kubernetes.io/projected/1f2afb5c-bfb2-4349-8000-4c0c90892d56-kube-api-access-scc7s\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.325717 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f2afb5c-bfb2-4349-8000-4c0c90892d56-logs\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 
12:06:27.325830 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-run-httpd\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.325871 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1f2afb5c-bfb2-4349-8000-4c0c90892d56-config-data\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.325936 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f2afb5c-bfb2-4349-8000-4c0c90892d56-scripts\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.326837 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f2afb5c-bfb2-4349-8000-4c0c90892d56-scripts\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.328817 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f2afb5c-bfb2-4349-8000-4c0c90892d56-logs\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.329758 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/1f2afb5c-bfb2-4349-8000-4c0c90892d56-config-data\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.354712 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1f2afb5c-bfb2-4349-8000-4c0c90892d56-horizon-secret-key\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.371184 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-b6qnm"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.373550 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-b6qnm" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.381121 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.382860 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-ktqxm" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.391872 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scc7s\" (UniqueName: \"kubernetes.io/projected/1f2afb5c-bfb2-4349-8000-4c0c90892d56-kube-api-access-scc7s\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.442062 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-config-data\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 
12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.442102 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-log-httpd\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.442162 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-run-httpd\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.442210 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq9wd\" (UniqueName: \"kubernetes.io/projected/d2d0a770-288f-40d8-832e-f5463863bef1-kube-api-access-qq9wd\") pod \"neutron-db-sync-wdwg2\" (UID: \"d2d0a770-288f-40d8-832e-f5463863bef1\") " pod="openstack/neutron-db-sync-wdwg2" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.442255 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.442282 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2d0a770-288f-40d8-832e-f5463863bef1-combined-ca-bundle\") pod \"neutron-db-sync-wdwg2\" (UID: \"d2d0a770-288f-40d8-832e-f5463863bef1\") " pod="openstack/neutron-db-sync-wdwg2" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.442321 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.442341 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r7pn\" (UniqueName: \"kubernetes.io/projected/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-kube-api-access-6r7pn\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.442373 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d2d0a770-288f-40d8-832e-f5463863bef1-config\") pod \"neutron-db-sync-wdwg2\" (UID: \"d2d0a770-288f-40d8-832e-f5463863bef1\") " pod="openstack/neutron-db-sync-wdwg2" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.442396 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-scripts\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.450923 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-log-httpd\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.450980 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-qdzjz"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.451310 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-run-httpd\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.452035 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.458062 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.458369 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.462563 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-config-data\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.463506 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-scripts\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.464329 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-klrh4" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.466929 4837 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.467109 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.481430 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r7pn\" (UniqueName: \"kubernetes.io/projected/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-kube-api-access-6r7pn\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.482704 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-wdwg2"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.493171 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-b6qnm"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.501048 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-qdzjz"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.510014 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-8vx8g"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.511293 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.518100 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.518300 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-s6tq2" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.518414 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.518788 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-x64d8"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.529083 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8vx8g"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.547059 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq9wd\" (UniqueName: \"kubernetes.io/projected/d2d0a770-288f-40d8-832e-f5463863bef1-kube-api-access-qq9wd\") pod \"neutron-db-sync-wdwg2\" (UID: \"d2d0a770-288f-40d8-832e-f5463863bef1\") " pod="openstack/neutron-db-sync-wdwg2" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.547100 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-combined-ca-bundle\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.547128 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-scripts\") pod \"cinder-db-sync-qdzjz\" (UID: 
\"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.547155 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p5k2\" (UniqueName: \"kubernetes.io/projected/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-kube-api-access-4p5k2\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.547178 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2d0a770-288f-40d8-832e-f5463863bef1-combined-ca-bundle\") pod \"neutron-db-sync-wdwg2\" (UID: \"d2d0a770-288f-40d8-832e-f5463863bef1\") " pod="openstack/neutron-db-sync-wdwg2" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.547203 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95b808e7-674f-4592-af6e-f7c8682f6a17-combined-ca-bundle\") pod \"barbican-db-sync-b6qnm\" (UID: \"95b808e7-674f-4592-af6e-f7c8682f6a17\") " pod="openstack/barbican-db-sync-b6qnm" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.547221 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sh7t\" (UniqueName: \"kubernetes.io/projected/95b808e7-674f-4592-af6e-f7c8682f6a17-kube-api-access-9sh7t\") pod \"barbican-db-sync-b6qnm\" (UID: \"95b808e7-674f-4592-af6e-f7c8682f6a17\") " pod="openstack/barbican-db-sync-b6qnm" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.547238 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-config-data\") pod \"cinder-db-sync-qdzjz\" (UID: 
\"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.547275 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d2d0a770-288f-40d8-832e-f5463863bef1-config\") pod \"neutron-db-sync-wdwg2\" (UID: \"d2d0a770-288f-40d8-832e-f5463863bef1\") " pod="openstack/neutron-db-sync-wdwg2" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.547293 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-db-sync-config-data\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.547308 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/95b808e7-674f-4592-af6e-f7c8682f6a17-db-sync-config-data\") pod \"barbican-db-sync-b6qnm\" (UID: \"95b808e7-674f-4592-af6e-f7c8682f6a17\") " pod="openstack/barbican-db-sync-b6qnm" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.547324 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-etc-machine-id\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.551687 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2d0a770-288f-40d8-832e-f5463863bef1-combined-ca-bundle\") pod \"neutron-db-sync-wdwg2\" (UID: \"d2d0a770-288f-40d8-832e-f5463863bef1\") " 
pod="openstack/neutron-db-sync-wdwg2" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.566999 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d2d0a770-288f-40d8-832e-f5463863bef1-config\") pod \"neutron-db-sync-wdwg2\" (UID: \"d2d0a770-288f-40d8-832e-f5463863bef1\") " pod="openstack/neutron-db-sync-wdwg2" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.577870 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq9wd\" (UniqueName: \"kubernetes.io/projected/d2d0a770-288f-40d8-832e-f5463863bef1-kube-api-access-qq9wd\") pod \"neutron-db-sync-wdwg2\" (UID: \"d2d0a770-288f-40d8-832e-f5463863bef1\") " pod="openstack/neutron-db-sync-wdwg2" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.604301 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-7kqcz"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.606300 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.638547 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7f5fbb4dd7-wcbjd"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.640579 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.649750 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p5k2\" (UniqueName: \"kubernetes.io/projected/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-kube-api-access-4p5k2\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.649816 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-logs\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.649842 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95b808e7-674f-4592-af6e-f7c8682f6a17-combined-ca-bundle\") pod \"barbican-db-sync-b6qnm\" (UID: \"95b808e7-674f-4592-af6e-f7c8682f6a17\") " pod="openstack/barbican-db-sync-b6qnm" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.649858 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sh7t\" (UniqueName: \"kubernetes.io/projected/95b808e7-674f-4592-af6e-f7c8682f6a17-kube-api-access-9sh7t\") pod \"barbican-db-sync-b6qnm\" (UID: \"95b808e7-674f-4592-af6e-f7c8682f6a17\") " pod="openstack/barbican-db-sync-b6qnm" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.649875 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-config-data\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc 
kubenswrapper[4837]: I0313 12:06:27.649914 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-scripts\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.649942 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-db-sync-config-data\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.649960 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/95b808e7-674f-4592-af6e-f7c8682f6a17-db-sync-config-data\") pod \"barbican-db-sync-b6qnm\" (UID: \"95b808e7-674f-4592-af6e-f7c8682f6a17\") " pod="openstack/barbican-db-sync-b6qnm" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.649978 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-etc-machine-id\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.649994 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-combined-ca-bundle\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.650080 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vtt2\" (UniqueName: \"kubernetes.io/projected/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-kube-api-access-2vtt2\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.650116 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-combined-ca-bundle\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.650136 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-config-data\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.650160 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-scripts\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.650496 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-etc-machine-id\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.653443 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/95b808e7-674f-4592-af6e-f7c8682f6a17-db-sync-config-data\") pod \"barbican-db-sync-b6qnm\" (UID: \"95b808e7-674f-4592-af6e-f7c8682f6a17\") " pod="openstack/barbican-db-sync-b6qnm" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.654038 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-config-data\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.654522 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.659564 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-combined-ca-bundle\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.667692 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-7kqcz"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.676358 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-scripts\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.679507 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sh7t\" (UniqueName: \"kubernetes.io/projected/95b808e7-674f-4592-af6e-f7c8682f6a17-kube-api-access-9sh7t\") pod \"barbican-db-sync-b6qnm\" (UID: \"95b808e7-674f-4592-af6e-f7c8682f6a17\") " 
pod="openstack/barbican-db-sync-b6qnm" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.680440 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p5k2\" (UniqueName: \"kubernetes.io/projected/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-kube-api-access-4p5k2\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.683214 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f5fbb4dd7-wcbjd"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.683383 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95b808e7-674f-4592-af6e-f7c8682f6a17-combined-ca-bundle\") pod \"barbican-db-sync-b6qnm\" (UID: \"95b808e7-674f-4592-af6e-f7c8682f6a17\") " pod="openstack/barbican-db-sync-b6qnm" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.689501 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.710988 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-db-sync-config-data\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.751940 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752001 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-scripts\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752039 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752070 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-combined-ca-bundle\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc 
kubenswrapper[4837]: I0313 12:06:27.752096 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmp8d\" (UniqueName: \"kubernetes.io/projected/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-kube-api-access-nmp8d\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752142 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-scripts\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752169 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-config-data\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752215 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-horizon-secret-key\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752243 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vtt2\" (UniqueName: \"kubernetes.io/projected/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-kube-api-access-2vtt2\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc 
kubenswrapper[4837]: I0313 12:06:27.752264 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-logs\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752308 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-config-data\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752360 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr9qj\" (UniqueName: \"kubernetes.io/projected/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-kube-api-access-jr9qj\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752385 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-config\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752406 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 
12:06:27.752433 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-logs\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752457 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.753119 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-logs\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.758465 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-combined-ca-bundle\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.768813 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-config-data\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.769002 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-scripts\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.771084 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vtt2\" (UniqueName: \"kubernetes.io/projected/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-kube-api-access-2vtt2\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.790153 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-wdwg2" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.801064 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-b6qnm" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.815009 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.861141 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.862508 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-horizon-secret-key\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.862542 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-logs\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.862602 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr9qj\" (UniqueName: \"kubernetes.io/projected/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-kube-api-access-jr9qj\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.862628 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-config\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.862665 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc 
kubenswrapper[4837]: I0313 12:06:27.862689 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.862728 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.862764 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.862797 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmp8d\" (UniqueName: \"kubernetes.io/projected/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-kube-api-access-nmp8d\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.862837 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-scripts\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.862860 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-config-data\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.865841 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-logs\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.865956 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.866869 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-config-data\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.867556 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-scripts\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.868289 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.868716 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.869074 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-config\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.870200 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.889184 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmp8d\" (UniqueName: \"kubernetes.io/projected/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-kube-api-access-nmp8d\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.890021 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr9qj\" (UniqueName: 
\"kubernetes.io/projected/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-kube-api-access-jr9qj\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.892079 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-horizon-secret-key\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.906680 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-x64d8"] Mar 13 12:06:27 crc kubenswrapper[4837]: W0313 12:06:27.961246 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64ff6ef1_7035_4f8e_8ee7_d0b858c92459.slice/crio-ed6fbc437d5476a19d20f53afd28bda92c212f6e9ba02f54fb6312bfa2653c75 WatchSource:0}: Error finding container ed6fbc437d5476a19d20f53afd28bda92c212f6e9ba02f54fb6312bfa2653c75: Status 404 returned error can't find the container with id ed6fbc437d5476a19d20f53afd28bda92c212f6e9ba02f54fb6312bfa2653c75 Mar 13 12:06:28 crc kubenswrapper[4837]: W0313 12:06:28.062440 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dec188c_ab95_4544_ac61_6f435f830f97.slice/crio-a81b99f16a86752e1a0e2bc8e53026978a648af00da7f92ffdb871f715779f44 WatchSource:0}: Error finding container a81b99f16a86752e1a0e2bc8e53026978a648af00da7f92ffdb871f715779f44: Status 404 returned error can't find the container with id a81b99f16a86752e1a0e2bc8e53026978a648af00da7f92ffdb871f715779f44 Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.064151 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kz7j9"] Mar 
13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.165901 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.184108 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.300703 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.431451 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6b5f9b5c85-p584g"] Mar 13 12:06:28 crc kubenswrapper[4837]: W0313 12:06:28.447569 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f2afb5c_bfb2_4349_8000_4c0c90892d56.slice/crio-18955fe5d50dd684cdf6370fe66929e8470b6ef70461302b53ed56fa3595256c WatchSource:0}: Error finding container 18955fe5d50dd684cdf6370fe66929e8470b6ef70461302b53ed56fa3595256c: Status 404 returned error can't find the container with id 18955fe5d50dd684cdf6370fe66929e8470b6ef70461302b53ed56fa3595256c Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.652839 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-wdwg2"] Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.670403 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kz7j9" event={"ID":"3dec188c-ab95-4544-ac61-6f435f830f97","Type":"ContainerStarted","Data":"40a0292d1dfe433f0d44082f12bb7e30ff5d447f6e395b1d7a570420f6252eeb"} Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.670465 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kz7j9" event={"ID":"3dec188c-ab95-4544-ac61-6f435f830f97","Type":"ContainerStarted","Data":"a81b99f16a86752e1a0e2bc8e53026978a648af00da7f92ffdb871f715779f44"} Mar 13 
12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.674419 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee","Type":"ContainerStarted","Data":"2fe508c1e7b8efe966205eebb2665129d9e9d777f425ce11141b713c93504dc7"} Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.676331 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b5f9b5c85-p584g" event={"ID":"1f2afb5c-bfb2-4349-8000-4c0c90892d56","Type":"ContainerStarted","Data":"18955fe5d50dd684cdf6370fe66929e8470b6ef70461302b53ed56fa3595256c"} Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.678392 4837 generic.go:334] "Generic (PLEG): container finished" podID="64ff6ef1-7035-4f8e-8ee7-d0b858c92459" containerID="1b9a20436328083d120b6ff2c35a76b78305ceb579a5babd22ad7d00d1dc8340" exitCode=0 Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.678462 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-x64d8" event={"ID":"64ff6ef1-7035-4f8e-8ee7-d0b858c92459","Type":"ContainerDied","Data":"1b9a20436328083d120b6ff2c35a76b78305ceb579a5babd22ad7d00d1dc8340"} Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.678543 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-x64d8" event={"ID":"64ff6ef1-7035-4f8e-8ee7-d0b858c92459","Type":"ContainerStarted","Data":"ed6fbc437d5476a19d20f53afd28bda92c212f6e9ba02f54fb6312bfa2653c75"} Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.717567 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-kz7j9" podStartSLOduration=2.71754906 podStartE2EDuration="2.71754906s" podCreationTimestamp="2026-03-13 12:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:28.709516936 +0000 UTC m=+1104.347783699" 
watchObservedRunningTime="2026-03-13 12:06:28.71754906 +0000 UTC m=+1104.355815823" Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.836136 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-qdzjz"] Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.844359 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8vx8g"] Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.852321 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-b6qnm"] Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.983685 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f5fbb4dd7-wcbjd"] Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.000060 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6b5f9b5c85-p584g"] Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.098327 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-c6787dc45-zbdfx"] Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.099991 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.148461 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-7kqcz"] Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.155161 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-c6787dc45-zbdfx"] Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.194211 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-horizon-secret-key\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.194389 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v796v\" (UniqueName: \"kubernetes.io/projected/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-kube-api-access-v796v\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.194543 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-logs\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.194667 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-config-data\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:06:29 crc 
kubenswrapper[4837]: I0313 12:06:29.194695 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-scripts\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx"
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.230257 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.235891 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-x64d8"
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.295845 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-horizon-secret-key\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx"
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.295936 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v796v\" (UniqueName: \"kubernetes.io/projected/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-kube-api-access-v796v\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx"
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.296007 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-logs\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx"
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.296058 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-config-data\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx"
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.296084 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-scripts\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx"
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.296784 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-logs\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx"
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.296841 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-scripts\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx"
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.297612 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-config-data\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx"
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.300764 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-horizon-secret-key\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx"
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.321376 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v796v\" (UniqueName: \"kubernetes.io/projected/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-kube-api-access-v796v\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx"
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.398304 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-ovsdbserver-nb\") pod \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") "
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.398686 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-dns-svc\") pod \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") "
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.398722 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-dns-swift-storage-0\") pod \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") "
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.398814 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-config\") pod \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") "
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.398854 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pj7d\" (UniqueName: \"kubernetes.io/projected/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-kube-api-access-8pj7d\") pod \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") "
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.398889 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-ovsdbserver-sb\") pod \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") "
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.407968 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-kube-api-access-8pj7d" (OuterVolumeSpecName: "kube-api-access-8pj7d") pod "64ff6ef1-7035-4f8e-8ee7-d0b858c92459" (UID: "64ff6ef1-7035-4f8e-8ee7-d0b858c92459"). InnerVolumeSpecName "kube-api-access-8pj7d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.425675 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "64ff6ef1-7035-4f8e-8ee7-d0b858c92459" (UID: "64ff6ef1-7035-4f8e-8ee7-d0b858c92459"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.427229 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "64ff6ef1-7035-4f8e-8ee7-d0b858c92459" (UID: "64ff6ef1-7035-4f8e-8ee7-d0b858c92459"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.435248 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "64ff6ef1-7035-4f8e-8ee7-d0b858c92459" (UID: "64ff6ef1-7035-4f8e-8ee7-d0b858c92459"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.438599 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "64ff6ef1-7035-4f8e-8ee7-d0b858c92459" (UID: "64ff6ef1-7035-4f8e-8ee7-d0b858c92459"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.441443 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-c6787dc45-zbdfx"
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.454919 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-config" (OuterVolumeSpecName: "config") pod "64ff6ef1-7035-4f8e-8ee7-d0b858c92459" (UID: "64ff6ef1-7035-4f8e-8ee7-d0b858c92459"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.501377 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.501421 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-config\") on node \"crc\" DevicePath \"\""
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.501435 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pj7d\" (UniqueName: \"kubernetes.io/projected/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-kube-api-access-8pj7d\") on node \"crc\" DevicePath \"\""
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.501491 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.501509 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.501521 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.701173 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-b6qnm" event={"ID":"95b808e7-674f-4592-af6e-f7c8682f6a17","Type":"ContainerStarted","Data":"02cdc5326e2dbc385d4e7090105a3655b6651929ef4db12950f0c379aaf98274"}
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.702758 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8vx8g" event={"ID":"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76","Type":"ContainerStarted","Data":"d8d4fa30fd1f227e47a679c4ebd48ddee761f9902a8c45ed343c205dc3f7e3b1"}
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.704564 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-x64d8" event={"ID":"64ff6ef1-7035-4f8e-8ee7-d0b858c92459","Type":"ContainerDied","Data":"ed6fbc437d5476a19d20f53afd28bda92c212f6e9ba02f54fb6312bfa2653c75"}
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.704589 4837 scope.go:117] "RemoveContainer" containerID="1b9a20436328083d120b6ff2c35a76b78305ceb579a5babd22ad7d00d1dc8340"
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.704728 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-x64d8"
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.708559 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qdzjz" event={"ID":"a44db1d6-6da2-41a5-a37f-ffc602f0d55a","Type":"ContainerStarted","Data":"d87408c4f80f070da48980a1c0c42ec26d6e0f566d37471876ae97d32157796e"}
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.712021 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wdwg2" event={"ID":"d2d0a770-288f-40d8-832e-f5463863bef1","Type":"ContainerStarted","Data":"167d2264a85f4435f333e5de927afa95b020419521d018ef924666fe1959c6ff"}
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.712055 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wdwg2" event={"ID":"d2d0a770-288f-40d8-832e-f5463863bef1","Type":"ContainerStarted","Data":"20eddadf1412bdac7244116ec35325dbc4b45413968aa761e6fe806d93d5742c"}
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.715606 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f5fbb4dd7-wcbjd" event={"ID":"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2","Type":"ContainerStarted","Data":"041964cbbb19e51e7b1a85074982a092d132a78534f19eb3616f99ddc8aa3e15"}
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.720307 4837 generic.go:334] "Generic (PLEG): container finished" podID="306aa5e9-7f77-4ff8-9cf6-5b3255c85337" containerID="a952d72f45aa2f65e1c6c2e7322bdcf16fb7324473b881a19caa21b16b66a760" exitCode=0
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.721937 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" event={"ID":"306aa5e9-7f77-4ff8-9cf6-5b3255c85337","Type":"ContainerDied","Data":"a952d72f45aa2f65e1c6c2e7322bdcf16fb7324473b881a19caa21b16b66a760"}
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.721971 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" event={"ID":"306aa5e9-7f77-4ff8-9cf6-5b3255c85337","Type":"ContainerStarted","Data":"2a0f4fde059e2510bd13af9a796ea4745ff14474c756cbe8a1063e240ce40a71"}
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.726919 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-wdwg2" podStartSLOduration=2.726902056 podStartE2EDuration="2.726902056s" podCreationTimestamp="2026-03-13 12:06:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:29.726583876 +0000 UTC m=+1105.364850639" watchObservedRunningTime="2026-03-13 12:06:29.726902056 +0000 UTC m=+1105.365168819"
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.803391 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-x64d8"]
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.823492 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-x64d8"]
Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.955286 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-c6787dc45-zbdfx"]
Mar 13 12:06:30 crc kubenswrapper[4837]: I0313 12:06:30.763777 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c6787dc45-zbdfx" event={"ID":"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14","Type":"ContainerStarted","Data":"c8e41db64721802eb9e2d30e33b7feaf3f233822df5127e44d2dee0b5f64ca8a"}
Mar 13 12:06:30 crc kubenswrapper[4837]: I0313 12:06:30.789156 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" event={"ID":"306aa5e9-7f77-4ff8-9cf6-5b3255c85337","Type":"ContainerStarted","Data":"2f11bc74222520fcf554ba948fc1d1529fb608acebf234b92a60442a96bc720f"}
Mar 13 12:06:30 crc kubenswrapper[4837]: I0313 12:06:30.789466 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz"
Mar 13 12:06:30 crc kubenswrapper[4837]: I0313 12:06:30.853562 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" podStartSLOduration=3.853541599 podStartE2EDuration="3.853541599s" podCreationTimestamp="2026-03-13 12:06:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:30.846156216 +0000 UTC m=+1106.484422999" watchObservedRunningTime="2026-03-13 12:06:30.853541599 +0000 UTC m=+1106.491808362"
Mar 13 12:06:31 crc kubenswrapper[4837]: I0313 12:06:31.077499 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64ff6ef1-7035-4f8e-8ee7-d0b858c92459" path="/var/lib/kubelet/pods/64ff6ef1-7035-4f8e-8ee7-d0b858c92459/volumes"
Mar 13 12:06:33 crc kubenswrapper[4837]: I0313 12:06:33.821052 4837 generic.go:334] "Generic (PLEG): container finished" podID="3dec188c-ab95-4544-ac61-6f435f830f97" containerID="40a0292d1dfe433f0d44082f12bb7e30ff5d447f6e395b1d7a570420f6252eeb" exitCode=0
Mar 13 12:06:33 crc kubenswrapper[4837]: I0313 12:06:33.821107 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kz7j9" event={"ID":"3dec188c-ab95-4544-ac61-6f435f830f97","Type":"ContainerDied","Data":"40a0292d1dfe433f0d44082f12bb7e30ff5d447f6e395b1d7a570420f6252eeb"}
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.771409 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7f5fbb4dd7-wcbjd"]
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.824651 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5596f9dfb8-m9bxb"]
Mar 13 12:06:35 crc kubenswrapper[4837]: E0313 12:06:35.825151 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ff6ef1-7035-4f8e-8ee7-d0b858c92459" containerName="init"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.825167 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ff6ef1-7035-4f8e-8ee7-d0b858c92459" containerName="init"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.825348 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="64ff6ef1-7035-4f8e-8ee7-d0b858c92459" containerName="init"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.826314 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5596f9dfb8-m9bxb"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.833731 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5596f9dfb8-m9bxb"]
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.835210 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.851147 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvsmz\" (UniqueName: \"kubernetes.io/projected/2a28d7a5-22a2-460a-a08c-8eb484e6c382-kube-api-access-wvsmz\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.851220 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a28d7a5-22a2-460a-a08c-8eb484e6c382-config-data\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.851245 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-horizon-secret-key\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.851284 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-horizon-tls-certs\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.851307 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-combined-ca-bundle\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.851326 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a28d7a5-22a2-460a-a08c-8eb484e6c382-logs\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.851352 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a28d7a5-22a2-460a-a08c-8eb484e6c382-scripts\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.899212 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-c6787dc45-zbdfx"]
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.927421 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-fd6ddfd9b-f66l8"]
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.929014 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-fd6ddfd9b-f66l8"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.964504 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-horizon-secret-key\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.964616 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d3df345-07a2-41bf-aae4-088b3ce83b63-horizon-tls-certs\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.964676 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-horizon-tls-certs\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.964724 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-combined-ca-bundle\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.964754 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a28d7a5-22a2-460a-a08c-8eb484e6c382-logs\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.964825 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfskx\" (UniqueName: \"kubernetes.io/projected/4d3df345-07a2-41bf-aae4-088b3ce83b63-kube-api-access-pfskx\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.964868 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a28d7a5-22a2-460a-a08c-8eb484e6c382-scripts\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.964990 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d3df345-07a2-41bf-aae4-088b3ce83b63-config-data\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.965034 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvsmz\" (UniqueName: \"kubernetes.io/projected/2a28d7a5-22a2-460a-a08c-8eb484e6c382-kube-api-access-wvsmz\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.965076 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d3df345-07a2-41bf-aae4-088b3ce83b63-scripts\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.965109 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d3df345-07a2-41bf-aae4-088b3ce83b63-combined-ca-bundle\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.965140 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4d3df345-07a2-41bf-aae4-088b3ce83b63-horizon-secret-key\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.965220 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d3df345-07a2-41bf-aae4-088b3ce83b63-logs\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.965260 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a28d7a5-22a2-460a-a08c-8eb484e6c382-config-data\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.968057 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a28d7a5-22a2-460a-a08c-8eb484e6c382-logs\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.968245 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a28d7a5-22a2-460a-a08c-8eb484e6c382-scripts\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.968542 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a28d7a5-22a2-460a-a08c-8eb484e6c382-config-data\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.973453 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-horizon-secret-key\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.973810 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-combined-ca-bundle\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.974253 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-horizon-tls-certs\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb"
Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.974990 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-fd6ddfd9b-f66l8"]
Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.010981 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvsmz\" (UniqueName: \"kubernetes.io/projected/2a28d7a5-22a2-460a-a08c-8eb484e6c382-kube-api-access-wvsmz\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb"
Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.066630 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d3df345-07a2-41bf-aae4-088b3ce83b63-config-data\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8"
Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.066718 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d3df345-07a2-41bf-aae4-088b3ce83b63-scripts\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8"
Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.066754 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d3df345-07a2-41bf-aae4-088b3ce83b63-combined-ca-bundle\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8"
Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.066786 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4d3df345-07a2-41bf-aae4-088b3ce83b63-horizon-secret-key\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8"
Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.066836 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d3df345-07a2-41bf-aae4-088b3ce83b63-logs\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8"
Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.066896 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d3df345-07a2-41bf-aae4-088b3ce83b63-horizon-tls-certs\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8"
Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.066963 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfskx\" (UniqueName: \"kubernetes.io/projected/4d3df345-07a2-41bf-aae4-088b3ce83b63-kube-api-access-pfskx\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8"
Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.068095 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d3df345-07a2-41bf-aae4-088b3ce83b63-logs\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8"
Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.068608 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d3df345-07a2-41bf-aae4-088b3ce83b63-scripts\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8"
Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.069736 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d3df345-07a2-41bf-aae4-088b3ce83b63-config-data\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8"
Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.074120 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4d3df345-07a2-41bf-aae4-088b3ce83b63-horizon-secret-key\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8"
Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.075177 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d3df345-07a2-41bf-aae4-088b3ce83b63-combined-ca-bundle\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8"
Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.081287 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d3df345-07a2-41bf-aae4-088b3ce83b63-horizon-tls-certs\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8"
Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.096660 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfskx\" (UniqueName: \"kubernetes.io/projected/4d3df345-07a2-41bf-aae4-088b3ce83b63-kube-api-access-pfskx\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8"
Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.154400 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5596f9dfb8-m9bxb"
Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.249546 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-fd6ddfd9b-f66l8"
Mar 13 12:06:37 crc kubenswrapper[4837]: I0313 12:06:37.878145 4837 generic.go:334] "Generic (PLEG): container finished" podID="b4490fb3-45d7-4b40-ad34-5bf33ba88491" containerID="483a91e4e8aeb62a4bc9d00fab2fa3f3452e90337b10ae7eb6d6d40d39b495c8" exitCode=0
Mar 13 12:06:37 crc kubenswrapper[4837]: I0313 12:06:37.878560 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jkthw" event={"ID":"b4490fb3-45d7-4b40-ad34-5bf33ba88491","Type":"ContainerDied","Data":"483a91e4e8aeb62a4bc9d00fab2fa3f3452e90337b10ae7eb6d6d40d39b495c8"}
Mar 13 12:06:38 crc kubenswrapper[4837]: I0313 12:06:38.168794 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz"
Mar 13 12:06:38 crc kubenswrapper[4837]: I0313 12:06:38.238742 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-5nlfg"]
Mar 13 12:06:38 crc kubenswrapper[4837]: I0313 12:06:38.239068 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" podUID="1a847add-da54-4a5d-9bca-5aea455eefe8" containerName="dnsmasq-dns" containerID="cri-o://7fd2e269ac89746bd02c6eb6e013fcc551156a1538f9f4807e06a63dd46236d2" gracePeriod=10
Mar 13 12:06:38 crc kubenswrapper[4837]: I0313 12:06:38.898457 4837 generic.go:334] "Generic (PLEG): container finished" podID="1a847add-da54-4a5d-9bca-5aea455eefe8" containerID="7fd2e269ac89746bd02c6eb6e013fcc551156a1538f9f4807e06a63dd46236d2" exitCode=0
Mar 13 12:06:38 crc kubenswrapper[4837]: I0313 12:06:38.898500 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" event={"ID":"1a847add-da54-4a5d-9bca-5aea455eefe8","Type":"ContainerDied","Data":"7fd2e269ac89746bd02c6eb6e013fcc551156a1538f9f4807e06a63dd46236d2"}
Mar 13 12:06:40 crc kubenswrapper[4837]: I0313 12:06:40.153480 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" podUID="1a847add-da54-4a5d-9bca-5aea455eefe8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: connect: connection refused"
Mar 13 12:06:45 crc kubenswrapper[4837]: I0313 12:06:45.153738 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" podUID="1a847add-da54-4a5d-9bca-5aea455eefe8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: connect: connection refused"
Mar 13 12:06:48 crc kubenswrapper[4837]: E0313 12:06:48.689500 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified"
Mar 13 12:06:48 crc kubenswrapper[4837]: E0313 12:06:48.690146 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n55dh5bfh565h684h667h549h5f4h55bh68fh5d8h8ch57h69h65h694h6dh5d7h9dhf6h57fh548h6chb4h549h57ch5f4h5cbhd5h658h548hf5h56cq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6r7pn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:06:48 crc kubenswrapper[4837]: I0313 12:06:48.985206 4837 generic.go:334] "Generic (PLEG): container finished" podID="d2d0a770-288f-40d8-832e-f5463863bef1" containerID="167d2264a85f4435f333e5de927afa95b020419521d018ef924666fe1959c6ff" exitCode=0 Mar 13 12:06:48 crc kubenswrapper[4837]: I0313 12:06:48.985256 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wdwg2" event={"ID":"d2d0a770-288f-40d8-832e-f5463863bef1","Type":"ContainerDied","Data":"167d2264a85f4435f333e5de927afa95b020419521d018ef924666fe1959c6ff"} Mar 13 12:06:50 crc kubenswrapper[4837]: I0313 12:06:50.154194 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" podUID="1a847add-da54-4a5d-9bca-5aea455eefe8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: connect: connection refused" Mar 13 12:06:50 crc kubenswrapper[4837]: I0313 12:06:50.154570 4837 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:50 crc kubenswrapper[4837]: E0313 12:06:50.295400 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 13 12:06:50 crc kubenswrapper[4837]: E0313 12:06:50.295984 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n687h5b7h54ch5b7h87h5cfh568hf6h75hc6h646hd8hcbh5f6h686hfh576h567h64fhc9h55h677h645h576h568hfbhbbh5d9h567hf6h54hb7q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jr9qj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*t
rue,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7f5fbb4dd7-wcbjd_openstack(99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:06:50 crc kubenswrapper[4837]: E0313 12:06:50.298699 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7f5fbb4dd7-wcbjd" podUID="99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2" Mar 13 12:06:51 crc kubenswrapper[4837]: E0313 12:06:51.711108 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Mar 13 12:06:51 crc kubenswrapper[4837]: E0313 12:06:51.711624 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2vtt2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-8vx8g_openstack(08c7b2a5-b0b8-433f-b55d-c64eaeea8b76): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:06:51 crc kubenswrapper[4837]: E0313 12:06:51.712876 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-8vx8g" podUID="08c7b2a5-b0b8-433f-b55d-c64eaeea8b76" Mar 13 12:06:51 crc kubenswrapper[4837]: E0313 12:06:51.726682 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 13 12:06:51 crc kubenswrapper[4837]: E0313 12:06:51.726839 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n57h598hcfh55ch5dbhf7hbh579h57ch687h88hcfh4h67ch7fhc6h5c5h57bh68ch5b4h5dbhch64bh9fh66h666h546h98h646h5f9h5fh5d5q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-scc7s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6b5f9b5c85-p584g_openstack(1f2afb5c-bfb2-4349-8000-4c0c90892d56): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 
12:06:51.865784 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.873451 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-jkthw" Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.973358 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsj8v\" (UniqueName: \"kubernetes.io/projected/3dec188c-ab95-4544-ac61-6f435f830f97-kube-api-access-gsj8v\") pod \"3dec188c-ab95-4544-ac61-6f435f830f97\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.973439 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-combined-ca-bundle\") pod \"3dec188c-ab95-4544-ac61-6f435f830f97\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.973498 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-credential-keys\") pod \"3dec188c-ab95-4544-ac61-6f435f830f97\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.973519 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-scripts\") pod \"3dec188c-ab95-4544-ac61-6f435f830f97\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.973542 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-db-sync-config-data\") pod \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.973607 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-config-data\") pod \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.973672 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-config-data\") pod \"3dec188c-ab95-4544-ac61-6f435f830f97\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.973700 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-combined-ca-bundle\") pod \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.973732 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-fernet-keys\") pod \"3dec188c-ab95-4544-ac61-6f435f830f97\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.973755 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zh4r\" (UniqueName: \"kubernetes.io/projected/b4490fb3-45d7-4b40-ad34-5bf33ba88491-kube-api-access-5zh4r\") pod \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 
12:06:51.979913 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-scripts" (OuterVolumeSpecName: "scripts") pod "3dec188c-ab95-4544-ac61-6f435f830f97" (UID: "3dec188c-ab95-4544-ac61-6f435f830f97"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.980830 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3dec188c-ab95-4544-ac61-6f435f830f97" (UID: "3dec188c-ab95-4544-ac61-6f435f830f97"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.981499 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b4490fb3-45d7-4b40-ad34-5bf33ba88491" (UID: "b4490fb3-45d7-4b40-ad34-5bf33ba88491"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.984703 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4490fb3-45d7-4b40-ad34-5bf33ba88491-kube-api-access-5zh4r" (OuterVolumeSpecName: "kube-api-access-5zh4r") pod "b4490fb3-45d7-4b40-ad34-5bf33ba88491" (UID: "b4490fb3-45d7-4b40-ad34-5bf33ba88491"). InnerVolumeSpecName "kube-api-access-5zh4r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.986137 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3dec188c-ab95-4544-ac61-6f435f830f97" (UID: "3dec188c-ab95-4544-ac61-6f435f830f97"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.995878 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dec188c-ab95-4544-ac61-6f435f830f97-kube-api-access-gsj8v" (OuterVolumeSpecName: "kube-api-access-gsj8v") pod "3dec188c-ab95-4544-ac61-6f435f830f97" (UID: "3dec188c-ab95-4544-ac61-6f435f830f97"). InnerVolumeSpecName "kube-api-access-gsj8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.003174 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-config-data" (OuterVolumeSpecName: "config-data") pod "3dec188c-ab95-4544-ac61-6f435f830f97" (UID: "3dec188c-ab95-4544-ac61-6f435f830f97"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.008953 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4490fb3-45d7-4b40-ad34-5bf33ba88491" (UID: "b4490fb3-45d7-4b40-ad34-5bf33ba88491"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.011389 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3dec188c-ab95-4544-ac61-6f435f830f97" (UID: "3dec188c-ab95-4544-ac61-6f435f830f97"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.017754 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jkthw" event={"ID":"b4490fb3-45d7-4b40-ad34-5bf33ba88491","Type":"ContainerDied","Data":"17d86872aee9655dc63bbe1e8b164cedfec91be43293c0487555e85e1e22c479"} Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.017790 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17d86872aee9655dc63bbe1e8b164cedfec91be43293c0487555e85e1e22c479" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.017846 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-jkthw" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.021375 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.023323 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kz7j9" event={"ID":"3dec188c-ab95-4544-ac61-6f435f830f97","Type":"ContainerDied","Data":"a81b99f16a86752e1a0e2bc8e53026978a648af00da7f92ffdb871f715779f44"} Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.023365 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a81b99f16a86752e1a0e2bc8e53026978a648af00da7f92ffdb871f715779f44" Mar 13 12:06:52 crc kubenswrapper[4837]: E0313 12:06:52.023889 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-8vx8g" podUID="08c7b2a5-b0b8-433f-b55d-c64eaeea8b76" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.051388 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-config-data" (OuterVolumeSpecName: "config-data") pod "b4490fb3-45d7-4b40-ad34-5bf33ba88491" (UID: "b4490fb3-45d7-4b40-ad34-5bf33ba88491"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.076163 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.076197 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.076209 4837 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.076218 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zh4r\" (UniqueName: \"kubernetes.io/projected/b4490fb3-45d7-4b40-ad34-5bf33ba88491-kube-api-access-5zh4r\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.076230 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsj8v\" (UniqueName: \"kubernetes.io/projected/3dec188c-ab95-4544-ac61-6f435f830f97-kube-api-access-gsj8v\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.076238 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.076245 4837 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 
12:06:52.076252 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.076265 4837 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.076272 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:52 crc kubenswrapper[4837]: E0313 12:06:52.786845 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 13 12:06:52 crc kubenswrapper[4837]: E0313 12:06:52.787367 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4p5k2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-qdzjz_openstack(a44db1d6-6da2-41a5-a37f-ffc602f0d55a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:06:52 crc kubenswrapper[4837]: E0313 12:06:52.788563 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-qdzjz" podUID="a44db1d6-6da2-41a5-a37f-ffc602f0d55a" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.946319 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-kz7j9"] Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.953691 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-kz7j9"] Mar 13 12:06:53 crc kubenswrapper[4837]: E0313 12:06:53.042401 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-qdzjz" podUID="a44db1d6-6da2-41a5-a37f-ffc602f0d55a" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.059850 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dec188c-ab95-4544-ac61-6f435f830f97" path="/var/lib/kubelet/pods/3dec188c-ab95-4544-ac61-6f435f830f97/volumes" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.060382 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-s7m97"] Mar 13 12:06:53 crc kubenswrapper[4837]: E0313 12:06:53.062313 4837 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3dec188c-ab95-4544-ac61-6f435f830f97" containerName="keystone-bootstrap" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.062342 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dec188c-ab95-4544-ac61-6f435f830f97" containerName="keystone-bootstrap" Mar 13 12:06:53 crc kubenswrapper[4837]: E0313 12:06:53.062363 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4490fb3-45d7-4b40-ad34-5bf33ba88491" containerName="glance-db-sync" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.062372 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4490fb3-45d7-4b40-ad34-5bf33ba88491" containerName="glance-db-sync" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.062888 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4490fb3-45d7-4b40-ad34-5bf33ba88491" containerName="glance-db-sync" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.062917 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dec188c-ab95-4544-ac61-6f435f830f97" containerName="keystone-bootstrap" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.063507 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-s7m97"] Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.063595 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.073194 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.073542 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-w6mdg" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.073790 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.074020 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.074386 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.198943 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-combined-ca-bundle\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.198992 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-fernet-keys\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.199128 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7vwb\" (UniqueName: \"kubernetes.io/projected/3af4ac68-a437-4be7-adab-1ef336f0cbda-kube-api-access-w7vwb\") pod \"keystone-bootstrap-s7m97\" 
(UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.199148 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-credential-keys\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.199161 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-scripts\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.199226 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-config-data\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.266200 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-gcf4g"] Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.270810 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.281745 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-gcf4g"] Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.301535 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-config-data\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.301594 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-combined-ca-bundle\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.301615 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-fernet-keys\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.301703 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7vwb\" (UniqueName: \"kubernetes.io/projected/3af4ac68-a437-4be7-adab-1ef336f0cbda-kube-api-access-w7vwb\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.301726 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-credential-keys\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.301743 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-scripts\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.309128 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-combined-ca-bundle\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.309657 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-credential-keys\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.309865 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-scripts\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.311259 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-config-data\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " 
pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.321468 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7vwb\" (UniqueName: \"kubernetes.io/projected/3af4ac68-a437-4be7-adab-1ef336f0cbda-kube-api-access-w7vwb\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.337866 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-fernet-keys\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.389355 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.403335 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.403432 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.403452 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkrqw\" (UniqueName: 
\"kubernetes.io/projected/55b83aaa-c867-44b7-bfea-b43c5bc0e471-kube-api-access-fkrqw\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.403477 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.403496 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-config\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.403609 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.505673 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.506116 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.506144 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkrqw\" (UniqueName: \"kubernetes.io/projected/55b83aaa-c867-44b7-bfea-b43c5bc0e471-kube-api-access-fkrqw\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.506185 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.506212 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-config\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.506332 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.506749 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.507068 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.507146 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.507671 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-config\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.511937 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.530392 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkrqw\" (UniqueName: \"kubernetes.io/projected/55b83aaa-c867-44b7-bfea-b43c5bc0e471-kube-api-access-fkrqw\") pod 
\"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.604387 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: E0313 12:06:53.624347 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 13 12:06:53 crc kubenswrapper[4837]: E0313 12:06:53.624502 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9sh7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadO
nlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-b6qnm_openstack(95b808e7-674f-4592-af6e-f7c8682f6a17): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:06:53 crc kubenswrapper[4837]: E0313 12:06:53.625956 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-b6qnm" podUID="95b808e7-674f-4592-af6e-f7c8682f6a17" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.716690 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.735599 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-wdwg2" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.745077 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.812469 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-config-data\") pod \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.812687 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-horizon-secret-key\") pod \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.812719 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-scripts\") pod \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.812786 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-logs\") pod \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.812823 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr9qj\" (UniqueName: \"kubernetes.io/projected/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-kube-api-access-jr9qj\") pod \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.813351 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-config-data" (OuterVolumeSpecName: "config-data") pod "99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2" (UID: "99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.813627 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-scripts" (OuterVolumeSpecName: "scripts") pod "99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2" (UID: "99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.813855 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-logs" (OuterVolumeSpecName: "logs") pod "99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2" (UID: "99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.817257 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-kube-api-access-jr9qj" (OuterVolumeSpecName: "kube-api-access-jr9qj") pod "99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2" (UID: "99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2"). InnerVolumeSpecName "kube-api-access-jr9qj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.820022 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2" (UID: "99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.914825 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2d0a770-288f-40d8-832e-f5463863bef1-combined-ca-bundle\") pod \"d2d0a770-288f-40d8-832e-f5463863bef1\" (UID: \"d2d0a770-288f-40d8-832e-f5463863bef1\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.914987 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-dns-swift-storage-0\") pod \"1a847add-da54-4a5d-9bca-5aea455eefe8\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.915090 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq9wd\" (UniqueName: \"kubernetes.io/projected/d2d0a770-288f-40d8-832e-f5463863bef1-kube-api-access-qq9wd\") pod \"d2d0a770-288f-40d8-832e-f5463863bef1\" (UID: \"d2d0a770-288f-40d8-832e-f5463863bef1\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.915163 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-ovsdbserver-sb\") pod \"1a847add-da54-4a5d-9bca-5aea455eefe8\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.915227 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-config\") pod \"1a847add-da54-4a5d-9bca-5aea455eefe8\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.915289 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-dns-svc\") pod \"1a847add-da54-4a5d-9bca-5aea455eefe8\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.915370 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-ovsdbserver-nb\") pod \"1a847add-da54-4a5d-9bca-5aea455eefe8\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.915459 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d2d0a770-288f-40d8-832e-f5463863bef1-config\") pod \"d2d0a770-288f-40d8-832e-f5463863bef1\" (UID: \"d2d0a770-288f-40d8-832e-f5463863bef1\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.915519 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trr9k\" (UniqueName: \"kubernetes.io/projected/1a847add-da54-4a5d-9bca-5aea455eefe8-kube-api-access-trr9k\") pod \"1a847add-da54-4a5d-9bca-5aea455eefe8\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.916195 4837 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.916238 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.916250 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-logs\") on node \"crc\" 
DevicePath \"\"" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.916261 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jr9qj\" (UniqueName: \"kubernetes.io/projected/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-kube-api-access-jr9qj\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.916274 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.920597 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2d0a770-288f-40d8-832e-f5463863bef1-kube-api-access-qq9wd" (OuterVolumeSpecName: "kube-api-access-qq9wd") pod "d2d0a770-288f-40d8-832e-f5463863bef1" (UID: "d2d0a770-288f-40d8-832e-f5463863bef1"). InnerVolumeSpecName "kube-api-access-qq9wd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.922018 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a847add-da54-4a5d-9bca-5aea455eefe8-kube-api-access-trr9k" (OuterVolumeSpecName: "kube-api-access-trr9k") pod "1a847add-da54-4a5d-9bca-5aea455eefe8" (UID: "1a847add-da54-4a5d-9bca-5aea455eefe8"). InnerVolumeSpecName "kube-api-access-trr9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.939731 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2d0a770-288f-40d8-832e-f5463863bef1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2d0a770-288f-40d8-832e-f5463863bef1" (UID: "d2d0a770-288f-40d8-832e-f5463863bef1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.943120 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2d0a770-288f-40d8-832e-f5463863bef1-config" (OuterVolumeSpecName: "config") pod "d2d0a770-288f-40d8-832e-f5463863bef1" (UID: "d2d0a770-288f-40d8-832e-f5463863bef1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.962049 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1a847add-da54-4a5d-9bca-5aea455eefe8" (UID: "1a847add-da54-4a5d-9bca-5aea455eefe8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.963679 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1a847add-da54-4a5d-9bca-5aea455eefe8" (UID: "1a847add-da54-4a5d-9bca-5aea455eefe8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.967662 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1a847add-da54-4a5d-9bca-5aea455eefe8" (UID: "1a847add-da54-4a5d-9bca-5aea455eefe8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.973160 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-config" (OuterVolumeSpecName: "config") pod "1a847add-da54-4a5d-9bca-5aea455eefe8" (UID: "1a847add-da54-4a5d-9bca-5aea455eefe8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.977702 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1a847add-da54-4a5d-9bca-5aea455eefe8" (UID: "1a847add-da54-4a5d-9bca-5aea455eefe8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.018549 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq9wd\" (UniqueName: \"kubernetes.io/projected/d2d0a770-288f-40d8-832e-f5463863bef1-kube-api-access-qq9wd\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.018590 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.018600 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.018627 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:54 crc 
kubenswrapper[4837]: I0313 12:06:54.018659 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.018668 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d2d0a770-288f-40d8-832e-f5463863bef1-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.018677 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trr9k\" (UniqueName: \"kubernetes.io/projected/1a847add-da54-4a5d-9bca-5aea455eefe8-kube-api-access-trr9k\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.018703 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2d0a770-288f-40d8-832e-f5463863bef1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.018713 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.105941 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wdwg2" event={"ID":"d2d0a770-288f-40d8-832e-f5463863bef1","Type":"ContainerDied","Data":"20eddadf1412bdac7244116ec35325dbc4b45413968aa761e6fe806d93d5742c"} Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.106176 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20eddadf1412bdac7244116ec35325dbc4b45413968aa761e6fe806d93d5742c" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.106254 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-wdwg2" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.117477 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" event={"ID":"1a847add-da54-4a5d-9bca-5aea455eefe8","Type":"ContainerDied","Data":"2cbead7100d7df29ad960b80cb3c7ee5eb871cec6fea242940565dd0d3726566"} Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.117535 4837 scope.go:117] "RemoveContainer" containerID="7fd2e269ac89746bd02c6eb6e013fcc551156a1538f9f4807e06a63dd46236d2" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.117603 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.123994 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.124328 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f5fbb4dd7-wcbjd" event={"ID":"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2","Type":"ContainerDied","Data":"041964cbbb19e51e7b1a85074982a092d132a78534f19eb3616f99ddc8aa3e15"} Mar 13 12:06:54 crc kubenswrapper[4837]: E0313 12:06:54.127259 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-b6qnm" podUID="95b808e7-674f-4592-af6e-f7c8682f6a17" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.159195 4837 scope.go:117] "RemoveContainer" containerID="88f5a9c016c890932c1524d02aeb53601bb1a2cc77b41ca9cf3fabeb2713f8a0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.192531 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:06:54 crc 
kubenswrapper[4837]: E0313 12:06:54.193041 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a847add-da54-4a5d-9bca-5aea455eefe8" containerName="init" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.193067 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a847add-da54-4a5d-9bca-5aea455eefe8" containerName="init" Mar 13 12:06:54 crc kubenswrapper[4837]: E0313 12:06:54.193095 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2d0a770-288f-40d8-832e-f5463863bef1" containerName="neutron-db-sync" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.193103 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d0a770-288f-40d8-832e-f5463863bef1" containerName="neutron-db-sync" Mar 13 12:06:54 crc kubenswrapper[4837]: E0313 12:06:54.193124 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a847add-da54-4a5d-9bca-5aea455eefe8" containerName="dnsmasq-dns" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.193132 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a847add-da54-4a5d-9bca-5aea455eefe8" containerName="dnsmasq-dns" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.193319 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a847add-da54-4a5d-9bca-5aea455eefe8" containerName="dnsmasq-dns" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.193337 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2d0a770-288f-40d8-832e-f5463863bef1" containerName="neutron-db-sync" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.194410 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.196357 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.197448 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-dvhzm" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.197909 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.218266 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.323744 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-scripts\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.323798 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.323822 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 
12:06:54.323900 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-config-data\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.323914 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.323932 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-logs\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.323956 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkskj\" (UniqueName: \"kubernetes.io/projected/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-kube-api-access-vkskj\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.326338 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-5nlfg"] Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.335412 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-5nlfg"] Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.405306 4837 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.406757 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.410556 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.427710 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-scripts\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.427772 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.427802 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.427896 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-config-data\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 
12:06:54.427922 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.427945 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-logs\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.427979 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkskj\" (UniqueName: \"kubernetes.io/projected/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-kube-api-access-vkskj\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.428443 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.428715 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.428811 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-logs\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.432091 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-scripts\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.432406 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-config-data\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.433388 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.435787 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.481181 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkskj\" (UniqueName: \"kubernetes.io/projected/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-kube-api-access-vkskj\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: 
I0313 12:06:54.489328 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.530042 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-config-data\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.530096 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zz8b\" (UniqueName: \"kubernetes.io/projected/643b18f8-6c85-43ec-977a-c9eade4db120-kube-api-access-9zz8b\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.530122 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/643b18f8-6c85-43ec-977a-c9eade4db120-logs\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.530355 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-scripts\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.530412 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/643b18f8-6c85-43ec-977a-c9eade4db120-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.530525 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.530599 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.561499 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-fd6ddfd9b-f66l8"] Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.608195 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.632384 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.632435 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.632508 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-config-data\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.632529 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zz8b\" (UniqueName: \"kubernetes.io/projected/643b18f8-6c85-43ec-977a-c9eade4db120-kube-api-access-9zz8b\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.632545 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/643b18f8-6c85-43ec-977a-c9eade4db120-logs\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 
12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.632593 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-scripts\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.632608 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/643b18f8-6c85-43ec-977a-c9eade4db120-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.633187 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/643b18f8-6c85-43ec-977a-c9eade4db120-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.634142 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.637885 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-scripts\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.638087 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/643b18f8-6c85-43ec-977a-c9eade4db120-logs\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.644839 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-config-data\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.646052 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.660976 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zz8b\" (UniqueName: \"kubernetes.io/projected/643b18f8-6c85-43ec-977a-c9eade4db120-kube-api-access-9zz8b\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.695741 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7f5fbb4dd7-wcbjd"] Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.702022 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7f5fbb4dd7-wcbjd"] Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.709933 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: E0313 12:06:54.742681 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/horizon-6b5f9b5c85-p584g" podUID="1f2afb5c-bfb2-4349-8000-4c0c90892d56" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.747151 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-gcf4g"] Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.753606 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-s7m97"] Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.763524 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5596f9dfb8-m9bxb"] Mar 13 12:06:54 crc kubenswrapper[4837]: W0313 12:06:54.801169 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55b83aaa_c867_44b7_bfea_b43c5bc0e471.slice/crio-b4d001249d11cd6dc828e98e85e66d69123eb5994ed8271394ef31c5c5efbf84 WatchSource:0}: Error finding container b4d001249d11cd6dc828e98e85e66d69123eb5994ed8271394ef31c5c5efbf84: Status 404 returned error can't find the container with id b4d001249d11cd6dc828e98e85e66d69123eb5994ed8271394ef31c5c5efbf84 Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.956913 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.988148 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-gcf4g"] Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.023032 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t9gtj"] Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.027297 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.083523 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a847add-da54-4a5d-9bca-5aea455eefe8" path="/var/lib/kubelet/pods/1a847add-da54-4a5d-9bca-5aea455eefe8/volumes" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.088431 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2" path="/var/lib/kubelet/pods/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2/volumes" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.092396 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t9gtj"] Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.144318 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r558r\" (UniqueName: \"kubernetes.io/projected/e06de12b-6071-4dce-81f1-68539347ca19-kube-api-access-r558r\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.144363 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: 
\"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.144406 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-config\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.149314 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.149489 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.149568 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-dns-svc\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.169897 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fd6ddfd9b-f66l8" 
event={"ID":"4d3df345-07a2-41bf-aae4-088b3ce83b63","Type":"ContainerStarted","Data":"04db307b8f39b1892804fc5ba2b134e48dc2f160a0dcc574789fbe9774cfb760"} Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.170135 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fd6ddfd9b-f66l8" event={"ID":"4d3df345-07a2-41bf-aae4-088b3ce83b63","Type":"ContainerStarted","Data":"5132c4458264f549762024a8619ce82c80b95d512d63406b0390b55fa902c696"} Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.174651 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee","Type":"ContainerStarted","Data":"9c2af154abb9a37a270c00c3cc335b4994ab6bb24ddaf80f1f5bfc313a6b9fb6"} Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.175942 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b5f9b5c85-p584g" event={"ID":"1f2afb5c-bfb2-4349-8000-4c0c90892d56","Type":"ContainerStarted","Data":"ccf4fdc9606b0ae8a6ecc82badd31da8c6fddc1f4294bee13d5805f8da627b43"} Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.176153 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6b5f9b5c85-p584g" podUID="1f2afb5c-bfb2-4349-8000-4c0c90892d56" containerName="horizon" containerID="cri-o://ccf4fdc9606b0ae8a6ecc82badd31da8c6fddc1f4294bee13d5805f8da627b43" gracePeriod=30 Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.195246 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-c6787dc45-zbdfx" podUID="5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" containerName="horizon-log" containerID="cri-o://b196a9394882e394baaf7222251dcba129911dfdfe911b4d1d679d89adbed206" gracePeriod=30 Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.195479 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-c6787dc45-zbdfx" 
podUID="5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" containerName="horizon" containerID="cri-o://7f77bd5a27791856de608c8a08f3c83e1663f61407b889e9328671983bac96ca" gracePeriod=30 Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.195553 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c6787dc45-zbdfx" event={"ID":"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14","Type":"ContainerStarted","Data":"7f77bd5a27791856de608c8a08f3c83e1663f61407b889e9328671983bac96ca"} Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.195573 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c6787dc45-zbdfx" event={"ID":"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14","Type":"ContainerStarted","Data":"b196a9394882e394baaf7222251dcba129911dfdfe911b4d1d679d89adbed206"} Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.202036 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5596f9dfb8-m9bxb" event={"ID":"2a28d7a5-22a2-460a-a08c-8eb484e6c382","Type":"ContainerStarted","Data":"60985fc2aa747df3481773b902df6591e8f7e0a9aaa937b1d8ccf7c3a2e33f6e"} Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.203280 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" event={"ID":"55b83aaa-c867-44b7-bfea-b43c5bc0e471","Type":"ContainerStarted","Data":"b4d001249d11cd6dc828e98e85e66d69123eb5994ed8271394ef31c5c5efbf84"} Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.214739 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7m97" event={"ID":"3af4ac68-a437-4be7-adab-1ef336f0cbda","Type":"ContainerStarted","Data":"2d322ad3eeeb347ecc17c10b7e12064f45bbd098c57202ba37c2350f75cdbf0c"} Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.214784 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7m97" 
event={"ID":"3af4ac68-a437-4be7-adab-1ef336f0cbda","Type":"ContainerStarted","Data":"a67502d2e54c1c7cec9684d1027629e2f2e584a89eba555b6f508bab8dc6003d"} Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.218145 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-67f9f46cf4-9cvcg"] Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.219583 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.225658 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.225982 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-88ssc" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.226161 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.226291 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.259715 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-config\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.259841 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-config\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.259870 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-ovndb-tls-certs\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.259908 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.260014 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-combined-ca-bundle\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.260057 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.260096 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s56d9\" (UniqueName: \"kubernetes.io/projected/073acab9-3b9b-432a-aef7-b59bad9fa6ea-kube-api-access-s56d9\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 
12:06:55.260138 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-dns-svc\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.260182 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r558r\" (UniqueName: \"kubernetes.io/projected/e06de12b-6071-4dce-81f1-68539347ca19-kube-api-access-r558r\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.260216 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.260247 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-httpd-config\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.264209 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.264731 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.264952 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.265289 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-config\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.268055 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67f9f46cf4-9cvcg"] Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.268712 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-dns-svc\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.283979 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-c6787dc45-zbdfx" podStartSLOduration=2.668016912 podStartE2EDuration="26.283964199s" podCreationTimestamp="2026-03-13 12:06:29 +0000 UTC" firstStartedPulling="2026-03-13 12:06:29.991444589 +0000 UTC m=+1105.629711352" lastFinishedPulling="2026-03-13 12:06:53.607391876 +0000 UTC 
m=+1129.245658639" observedRunningTime="2026-03-13 12:06:55.235507516 +0000 UTC m=+1130.873774269" watchObservedRunningTime="2026-03-13 12:06:55.283964199 +0000 UTC m=+1130.922230952" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.297065 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r558r\" (UniqueName: \"kubernetes.io/projected/e06de12b-6071-4dce-81f1-68539347ca19-kube-api-access-r558r\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.342786 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.345869 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-s7m97" podStartSLOduration=2.3458500239999998 podStartE2EDuration="2.345850024s" podCreationTimestamp="2026-03-13 12:06:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:55.314098616 +0000 UTC m=+1130.952365379" watchObservedRunningTime="2026-03-13 12:06:55.345850024 +0000 UTC m=+1130.984116787" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.364939 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-config\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.364979 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-ovndb-tls-certs\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: 
\"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.365045 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-combined-ca-bundle\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.365094 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s56d9\" (UniqueName: \"kubernetes.io/projected/073acab9-3b9b-432a-aef7-b59bad9fa6ea-kube-api-access-s56d9\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.365239 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-httpd-config\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.372625 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-config\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.373713 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-combined-ca-bundle\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc 
kubenswrapper[4837]: I0313 12:06:55.373747 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-ovndb-tls-certs\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.385598 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-httpd-config\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.408346 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s56d9\" (UniqueName: \"kubernetes.io/projected/073acab9-3b9b-432a-aef7-b59bad9fa6ea-kube-api-access-s56d9\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.443858 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.611654 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.790145 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.246433 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t9gtj"] Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.304017 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5596f9dfb8-m9bxb" event={"ID":"2a28d7a5-22a2-460a-a08c-8eb484e6c382","Type":"ContainerStarted","Data":"7e464f7436823332f050e26237bc563d04c928c21ee9b8d3087ae1cc9a85aacb"} Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.304059 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5596f9dfb8-m9bxb" event={"ID":"2a28d7a5-22a2-460a-a08c-8eb484e6c382","Type":"ContainerStarted","Data":"92b3db8efc4bd781409e05974c86a887259d700facd2c2ab05a9fcc6613ce654"} Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.308018 4837 generic.go:334] "Generic (PLEG): container finished" podID="55b83aaa-c867-44b7-bfea-b43c5bc0e471" containerID="2783265a0648c64c9cf8018b7e58d013d90c350f449426212bca3f25289ba423" exitCode=0 Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.308080 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" event={"ID":"55b83aaa-c867-44b7-bfea-b43c5bc0e471","Type":"ContainerDied","Data":"2783265a0648c64c9cf8018b7e58d013d90c350f449426212bca3f25289ba423"} Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.313349 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"643b18f8-6c85-43ec-977a-c9eade4db120","Type":"ContainerStarted","Data":"bd322992210798258d691847d1f616bdd9cfed212e86bf6a55ea38cd76ee2987"} Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.317252 4837 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/horizon-fd6ddfd9b-f66l8" event={"ID":"4d3df345-07a2-41bf-aae4-088b3ce83b63","Type":"ContainerStarted","Data":"ea3e72ba663f65975cf0c5f22cbf3d6fabe3df67dd048e7632a78b0056e1eada"} Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.346547 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5596f9dfb8-m9bxb" podStartSLOduration=21.346525239 podStartE2EDuration="21.346525239s" podCreationTimestamp="2026-03-13 12:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:56.337145813 +0000 UTC m=+1131.975412576" watchObservedRunningTime="2026-03-13 12:06:56.346525239 +0000 UTC m=+1131.984792002" Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.376384 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1","Type":"ContainerStarted","Data":"c6ddb54236268786175c021b9359fcf4e8bd417495e2748483768742d3e54d9e"} Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.387809 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67f9f46cf4-9cvcg"] Mar 13 12:06:56 crc kubenswrapper[4837]: W0313 12:06:56.405796 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod073acab9_3b9b_432a_aef7_b59bad9fa6ea.slice/crio-f8cb990fe37777f793f0250c14cdfa0b903194e81a71cae32c4012805c32b7c7 WatchSource:0}: Error finding container f8cb990fe37777f793f0250c14cdfa0b903194e81a71cae32c4012805c32b7c7: Status 404 returned error can't find the container with id f8cb990fe37777f793f0250c14cdfa0b903194e81a71cae32c4012805c32b7c7 Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.436697 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-fd6ddfd9b-f66l8" 
podStartSLOduration=21.436677441 podStartE2EDuration="21.436677441s" podCreationTimestamp="2026-03-13 12:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:56.407427722 +0000 UTC m=+1132.045694495" watchObservedRunningTime="2026-03-13 12:06:56.436677441 +0000 UTC m=+1132.074944204" Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.784603 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.935453 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-ovsdbserver-sb\") pod \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.935616 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-dns-svc\") pod \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.935669 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkrqw\" (UniqueName: \"kubernetes.io/projected/55b83aaa-c867-44b7-bfea-b43c5bc0e471-kube-api-access-fkrqw\") pod \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.935699 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-config\") pod \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " Mar 13 12:06:56 
crc kubenswrapper[4837]: I0313 12:06:56.935779 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-dns-swift-storage-0\") pod \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.936002 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-ovsdbserver-nb\") pod \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.973513 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55b83aaa-c867-44b7-bfea-b43c5bc0e471-kube-api-access-fkrqw" (OuterVolumeSpecName: "kube-api-access-fkrqw") pod "55b83aaa-c867-44b7-bfea-b43c5bc0e471" (UID: "55b83aaa-c867-44b7-bfea-b43c5bc0e471"). InnerVolumeSpecName "kube-api-access-fkrqw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.978414 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.041195 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkrqw\" (UniqueName: \"kubernetes.io/projected/55b83aaa-c867-44b7-bfea-b43c5bc0e471-kube-api-access-fkrqw\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.173195 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.295226 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-config" (OuterVolumeSpecName: "config") pod "55b83aaa-c867-44b7-bfea-b43c5bc0e471" (UID: "55b83aaa-c867-44b7-bfea-b43c5bc0e471"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.296922 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "55b83aaa-c867-44b7-bfea-b43c5bc0e471" (UID: "55b83aaa-c867-44b7-bfea-b43c5bc0e471"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.325249 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "55b83aaa-c867-44b7-bfea-b43c5bc0e471" (UID: "55b83aaa-c867-44b7-bfea-b43c5bc0e471"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.354861 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.354910 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.354925 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.399869 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1","Type":"ContainerStarted","Data":"cdf4673c99faf73809ec550d38a150bd081436099afc9a2124cbc76da3873ab7"} Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.414146 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "55b83aaa-c867-44b7-bfea-b43c5bc0e471" (UID: "55b83aaa-c867-44b7-bfea-b43c5bc0e471"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.414565 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "55b83aaa-c867-44b7-bfea-b43c5bc0e471" (UID: "55b83aaa-c867-44b7-bfea-b43c5bc0e471"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.442664 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.442654 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" event={"ID":"55b83aaa-c867-44b7-bfea-b43c5bc0e471","Type":"ContainerDied","Data":"b4d001249d11cd6dc828e98e85e66d69123eb5994ed8271394ef31c5c5efbf84"} Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.443117 4837 scope.go:117] "RemoveContainer" containerID="2783265a0648c64c9cf8018b7e58d013d90c350f449426212bca3f25289ba423" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.454211 4837 generic.go:334] "Generic (PLEG): container finished" podID="e06de12b-6071-4dce-81f1-68539347ca19" containerID="059e96d08021e09252961e7fbfcfd1f264e2e10514bec4760c10d5076b6990a3" exitCode=0 Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.454288 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" event={"ID":"e06de12b-6071-4dce-81f1-68539347ca19","Type":"ContainerDied","Data":"059e96d08021e09252961e7fbfcfd1f264e2e10514bec4760c10d5076b6990a3"} Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.454311 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" event={"ID":"e06de12b-6071-4dce-81f1-68539347ca19","Type":"ContainerStarted","Data":"bd08737ad8dd4994dc887a5676bf16b9103265c49a66d4c535944bbd694008c2"} Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.456382 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.456408 4837 reconciler_common.go:293] "Volume 
detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.468661 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67f9f46cf4-9cvcg" event={"ID":"073acab9-3b9b-432a-aef7-b59bad9fa6ea","Type":"ContainerStarted","Data":"592ae8d9d134287aaaf8e8bd131ed85ae4a0f882f18fe4b309c2608413d81458"} Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.468697 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67f9f46cf4-9cvcg" event={"ID":"073acab9-3b9b-432a-aef7-b59bad9fa6ea","Type":"ContainerStarted","Data":"f8cb990fe37777f793f0250c14cdfa0b903194e81a71cae32c4012805c32b7c7"} Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.655512 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.698701 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-gcf4g"] Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.725343 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-gcf4g"] Mar 13 12:06:58 crc kubenswrapper[4837]: I0313 12:06:58.500111 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" event={"ID":"e06de12b-6071-4dce-81f1-68539347ca19","Type":"ContainerStarted","Data":"3ac59c1680a2ceb3bddaf98537b9cf745919c45cdb4c77563d14fc2bd8920764"} Mar 13 12:06:58 crc kubenswrapper[4837]: I0313 12:06:58.500762 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:58 crc kubenswrapper[4837]: I0313 12:06:58.512259 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67f9f46cf4-9cvcg" 
event={"ID":"073acab9-3b9b-432a-aef7-b59bad9fa6ea","Type":"ContainerStarted","Data":"5abad0665ef76d6dfd0a8789ace2e34eb216059566daed67b5a9ba64a43b080f"}
Mar 13 12:06:58 crc kubenswrapper[4837]: I0313 12:06:58.513521 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-67f9f46cf4-9cvcg"
Mar 13 12:06:58 crc kubenswrapper[4837]: I0313 12:06:58.531839 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" podStartSLOduration=4.531814667 podStartE2EDuration="4.531814667s" podCreationTimestamp="2026-03-13 12:06:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:58.523498686 +0000 UTC m=+1134.161765449" watchObservedRunningTime="2026-03-13 12:06:58.531814667 +0000 UTC m=+1134.170081440"
Mar 13 12:06:58 crc kubenswrapper[4837]: I0313 12:06:58.533080 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1","Type":"ContainerStarted","Data":"61a252e0f8f14e9b64e09ce9249c12af6d208dea329e7deb38c30d99c07d2a83"}
Mar 13 12:06:58 crc kubenswrapper[4837]: I0313 12:06:58.533166 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" containerName="glance-log" containerID="cri-o://cdf4673c99faf73809ec550d38a150bd081436099afc9a2124cbc76da3873ab7" gracePeriod=30
Mar 13 12:06:58 crc kubenswrapper[4837]: I0313 12:06:58.533257 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" containerName="glance-httpd" containerID="cri-o://61a252e0f8f14e9b64e09ce9249c12af6d208dea329e7deb38c30d99c07d2a83" gracePeriod=30
Mar 13 12:06:58 crc kubenswrapper[4837]: I0313 12:06:58.577351 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"643b18f8-6c85-43ec-977a-c9eade4db120","Type":"ContainerStarted","Data":"45488696f1bb86187ce327f2ad0012c25e169de59468b4dda6460464fc5497ce"}
Mar 13 12:06:58 crc kubenswrapper[4837]: I0313 12:06:58.580751 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-67f9f46cf4-9cvcg" podStartSLOduration=3.5774781620000002 podStartE2EDuration="3.577478162s" podCreationTimestamp="2026-03-13 12:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:58.546507589 +0000 UTC m=+1134.184774352" watchObservedRunningTime="2026-03-13 12:06:58.577478162 +0000 UTC m=+1134.215744925"
Mar 13 12:06:58 crc kubenswrapper[4837]: I0313 12:06:58.590332 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.590308375 podStartE2EDuration="5.590308375s" podCreationTimestamp="2026-03-13 12:06:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:58.571753442 +0000 UTC m=+1134.210020205" watchObservedRunningTime="2026-03-13 12:06:58.590308375 +0000 UTC m=+1134.228575128"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.062052 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55b83aaa-c867-44b7-bfea-b43c5bc0e471" path="/var/lib/kubelet/pods/55b83aaa-c867-44b7-bfea-b43c5bc0e471/volumes"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.277489 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c5479d889-t9mnp"]
Mar 13 12:06:59 crc kubenswrapper[4837]: E0313 12:06:59.277912 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55b83aaa-c867-44b7-bfea-b43c5bc0e471" containerName="init"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.277929 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="55b83aaa-c867-44b7-bfea-b43c5bc0e471" containerName="init"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.278090 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="55b83aaa-c867-44b7-bfea-b43c5bc0e471" containerName="init"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.278995 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c5479d889-t9mnp"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.281290 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.281515 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.298579 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c5479d889-t9mnp"]
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.427894 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-combined-ca-bundle\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.427951 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-ovndb-tls-certs\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.427985 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vh2b\" (UniqueName: \"kubernetes.io/projected/0faefca0-6038-4bdf-856e-b7cb5b6c5536-kube-api-access-4vh2b\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.428100 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-httpd-config\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.428198 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-public-tls-certs\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.428267 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-internal-tls-certs\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.428307 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-config\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.441967 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-c6787dc45-zbdfx"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.529740 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-combined-ca-bundle\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.529804 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-ovndb-tls-certs\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.529836 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vh2b\" (UniqueName: \"kubernetes.io/projected/0faefca0-6038-4bdf-856e-b7cb5b6c5536-kube-api-access-4vh2b\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.529869 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-httpd-config\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.529902 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-public-tls-certs\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.529929 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-internal-tls-certs\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.529948 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-config\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.539459 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-httpd-config\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.540981 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-internal-tls-certs\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.590976 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-ovndb-tls-certs\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.592586 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-config\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.597797 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vh2b\" (UniqueName: \"kubernetes.io/projected/0faefca0-6038-4bdf-856e-b7cb5b6c5536-kube-api-access-4vh2b\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.600315 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-public-tls-certs\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.604308 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-combined-ca-bundle\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.668354 4837 generic.go:334] "Generic (PLEG): container finished" podID="9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" containerID="61a252e0f8f14e9b64e09ce9249c12af6d208dea329e7deb38c30d99c07d2a83" exitCode=143
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.668400 4837 generic.go:334] "Generic (PLEG): container finished" podID="9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" containerID="cdf4673c99faf73809ec550d38a150bd081436099afc9a2124cbc76da3873ab7" exitCode=143
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.668781 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1","Type":"ContainerDied","Data":"61a252e0f8f14e9b64e09ce9249c12af6d208dea329e7deb38c30d99c07d2a83"}
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.668889 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1","Type":"ContainerDied","Data":"cdf4673c99faf73809ec550d38a150bd081436099afc9a2124cbc76da3873ab7"}
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.686113 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"643b18f8-6c85-43ec-977a-c9eade4db120","Type":"ContainerStarted","Data":"483de9baf8ab7c417841c02e737142f145dc9ff207f9fb3cb2947353d68dc2a1"}
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.686362 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="643b18f8-6c85-43ec-977a-c9eade4db120" containerName="glance-log" containerID="cri-o://45488696f1bb86187ce327f2ad0012c25e169de59468b4dda6460464fc5497ce" gracePeriod=30
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.686539 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="643b18f8-6c85-43ec-977a-c9eade4db120" containerName="glance-httpd" containerID="cri-o://483de9baf8ab7c417841c02e737142f145dc9ff207f9fb3cb2947353d68dc2a1" gracePeriod=30
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.715472 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.715452621 podStartE2EDuration="6.715452621s" podCreationTimestamp="2026-03-13 12:06:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:59.708196093 +0000 UTC m=+1135.346462856" watchObservedRunningTime="2026-03-13 12:06:59.715452621 +0000 UTC m=+1135.353719384"
Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.900985 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c5479d889-t9mnp"
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.026658 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.167592 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-scripts\") pod \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") "
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.167716 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-httpd-run\") pod \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") "
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.167782 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-logs\") pod \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") "
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.167813 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-combined-ca-bundle\") pod \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") "
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.167848 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") "
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.167945 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-config-data\") pod \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") "
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.167987 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkskj\" (UniqueName: \"kubernetes.io/projected/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-kube-api-access-vkskj\") pod \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") "
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.170554 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" (UID: "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.171218 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-logs" (OuterVolumeSpecName: "logs") pod "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" (UID: "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.180777 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-scripts" (OuterVolumeSpecName: "scripts") pod "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" (UID: "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.180848 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-kube-api-access-vkskj" (OuterVolumeSpecName: "kube-api-access-vkskj") pod "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" (UID: "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1"). InnerVolumeSpecName "kube-api-access-vkskj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.190167 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" (UID: "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.213745 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" (UID: "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.228094 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-config-data" (OuterVolumeSpecName: "config-data") pod "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" (UID: "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.269951 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.269985 4837 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.269996 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-logs\") on node \"crc\" DevicePath \"\""
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.270007 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.270035 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.270044 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.270058 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkskj\" (UniqueName: \"kubernetes.io/projected/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-kube-api-access-vkskj\") on node \"crc\" DevicePath \"\""
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.289139 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.377738 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.635800 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c5479d889-t9mnp"]
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.704054 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1","Type":"ContainerDied","Data":"c6ddb54236268786175c021b9359fcf4e8bd417495e2748483768742d3e54d9e"}
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.704063 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.704127 4837 scope.go:117] "RemoveContainer" containerID="61a252e0f8f14e9b64e09ce9249c12af6d208dea329e7deb38c30d99c07d2a83"
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.707303 4837 generic.go:334] "Generic (PLEG): container finished" podID="643b18f8-6c85-43ec-977a-c9eade4db120" containerID="483de9baf8ab7c417841c02e737142f145dc9ff207f9fb3cb2947353d68dc2a1" exitCode=0
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.707334 4837 generic.go:334] "Generic (PLEG): container finished" podID="643b18f8-6c85-43ec-977a-c9eade4db120" containerID="45488696f1bb86187ce327f2ad0012c25e169de59468b4dda6460464fc5497ce" exitCode=143
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.708451 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"643b18f8-6c85-43ec-977a-c9eade4db120","Type":"ContainerDied","Data":"483de9baf8ab7c417841c02e737142f145dc9ff207f9fb3cb2947353d68dc2a1"}
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.708488 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"643b18f8-6c85-43ec-977a-c9eade4db120","Type":"ContainerDied","Data":"45488696f1bb86187ce327f2ad0012c25e169de59468b4dda6460464fc5497ce"}
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.768473 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.813362 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.828059 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 12:07:00 crc kubenswrapper[4837]: E0313 12:07:00.828456 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" containerName="glance-log"
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.828475 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" containerName="glance-log"
Mar 13 12:07:00 crc kubenswrapper[4837]: E0313 12:07:00.828498 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" containerName="glance-httpd"
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.828506 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" containerName="glance-httpd"
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.828777 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" containerName="glance-log"
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.828807 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" containerName="glance-httpd"
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.829765 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.835938 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.836164 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.848026 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.001727 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.001767 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvzd7\" (UniqueName: \"kubernetes.io/projected/f0173ba9-535a-435d-bc51-75c069e69e46-kube-api-access-gvzd7\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.001800 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.001840 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.001901 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-scripts\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.001926 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-config-data\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.001944 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0173ba9-535a-435d-bc51-75c069e69e46-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.001958 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0173ba9-535a-435d-bc51-75c069e69e46-logs\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.080430 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" path="/var/lib/kubelet/pods/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1/volumes"
Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.105170 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.105216 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvzd7\" (UniqueName: \"kubernetes.io/projected/f0173ba9-535a-435d-bc51-75c069e69e46-kube-api-access-gvzd7\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.105248 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.105280 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.105331 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-scripts\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.105353 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-config-data\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.105377 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0173ba9-535a-435d-bc51-75c069e69e46-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.105404 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0173ba9-535a-435d-bc51-75c069e69e46-logs\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.106304 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0"
Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.106913 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0173ba9-535a-435d-bc51-75c069e69e46-logs\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.110922 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0173ba9-535a-435d-bc51-75c069e69e46-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.116814 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-scripts\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.117824 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-config-data\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.118690 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.121245 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.133102 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvzd7\"
(UniqueName: \"kubernetes.io/projected/f0173ba9-535a-435d-bc51-75c069e69e46-kube-api-access-gvzd7\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.151875 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.456357 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.722082 4837 generic.go:334] "Generic (PLEG): container finished" podID="3af4ac68-a437-4be7-adab-1ef336f0cbda" containerID="2d322ad3eeeb347ecc17c10b7e12064f45bbd098c57202ba37c2350f75cdbf0c" exitCode=0 Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.722124 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7m97" event={"ID":"3af4ac68-a437-4be7-adab-1ef336f0cbda","Type":"ContainerDied","Data":"2d322ad3eeeb347ecc17c10b7e12064f45bbd098c57202ba37c2350f75cdbf0c"} Mar 13 12:07:04 crc kubenswrapper[4837]: I0313 12:07:04.918047 4837 scope.go:117] "RemoveContainer" containerID="cdf4673c99faf73809ec550d38a150bd081436099afc9a2124cbc76da3873ab7" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.136992 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.142358 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.299463 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"643b18f8-6c85-43ec-977a-c9eade4db120\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.304599 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7vwb\" (UniqueName: \"kubernetes.io/projected/3af4ac68-a437-4be7-adab-1ef336f0cbda-kube-api-access-w7vwb\") pod \"3af4ac68-a437-4be7-adab-1ef336f0cbda\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.304939 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-config-data\") pod \"643b18f8-6c85-43ec-977a-c9eade4db120\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.305091 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-config-data\") pod \"3af4ac68-a437-4be7-adab-1ef336f0cbda\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.305190 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zz8b\" (UniqueName: \"kubernetes.io/projected/643b18f8-6c85-43ec-977a-c9eade4db120-kube-api-access-9zz8b\") pod \"643b18f8-6c85-43ec-977a-c9eade4db120\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.305295 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-fernet-keys\") pod \"3af4ac68-a437-4be7-adab-1ef336f0cbda\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.305441 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-scripts\") pod \"3af4ac68-a437-4be7-adab-1ef336f0cbda\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.305569 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-combined-ca-bundle\") pod \"3af4ac68-a437-4be7-adab-1ef336f0cbda\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.305713 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-combined-ca-bundle\") pod \"643b18f8-6c85-43ec-977a-c9eade4db120\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.305819 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/643b18f8-6c85-43ec-977a-c9eade4db120-httpd-run\") pod \"643b18f8-6c85-43ec-977a-c9eade4db120\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.305938 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/643b18f8-6c85-43ec-977a-c9eade4db120-logs\") pod \"643b18f8-6c85-43ec-977a-c9eade4db120\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.306500 4837 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-credential-keys\") pod \"3af4ac68-a437-4be7-adab-1ef336f0cbda\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.306616 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-scripts\") pod \"643b18f8-6c85-43ec-977a-c9eade4db120\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.307689 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "643b18f8-6c85-43ec-977a-c9eade4db120" (UID: "643b18f8-6c85-43ec-977a-c9eade4db120"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.308003 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/643b18f8-6c85-43ec-977a-c9eade4db120-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "643b18f8-6c85-43ec-977a-c9eade4db120" (UID: "643b18f8-6c85-43ec-977a-c9eade4db120"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.312283 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/643b18f8-6c85-43ec-977a-c9eade4db120-logs" (OuterVolumeSpecName: "logs") pod "643b18f8-6c85-43ec-977a-c9eade4db120" (UID: "643b18f8-6c85-43ec-977a-c9eade4db120"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.321985 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/643b18f8-6c85-43ec-977a-c9eade4db120-kube-api-access-9zz8b" (OuterVolumeSpecName: "kube-api-access-9zz8b") pod "643b18f8-6c85-43ec-977a-c9eade4db120" (UID: "643b18f8-6c85-43ec-977a-c9eade4db120"). InnerVolumeSpecName "kube-api-access-9zz8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.326416 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3af4ac68-a437-4be7-adab-1ef336f0cbda-kube-api-access-w7vwb" (OuterVolumeSpecName: "kube-api-access-w7vwb") pod "3af4ac68-a437-4be7-adab-1ef336f0cbda" (UID: "3af4ac68-a437-4be7-adab-1ef336f0cbda"). InnerVolumeSpecName "kube-api-access-w7vwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.328926 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-scripts" (OuterVolumeSpecName: "scripts") pod "3af4ac68-a437-4be7-adab-1ef336f0cbda" (UID: "3af4ac68-a437-4be7-adab-1ef336f0cbda"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.329569 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3af4ac68-a437-4be7-adab-1ef336f0cbda" (UID: "3af4ac68-a437-4be7-adab-1ef336f0cbda"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.338180 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-scripts" (OuterVolumeSpecName: "scripts") pod "643b18f8-6c85-43ec-977a-c9eade4db120" (UID: "643b18f8-6c85-43ec-977a-c9eade4db120"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.368910 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3af4ac68-a437-4be7-adab-1ef336f0cbda" (UID: "3af4ac68-a437-4be7-adab-1ef336f0cbda"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.410934 4837 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.411207 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.411276 4837 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/643b18f8-6c85-43ec-977a-c9eade4db120-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.411344 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/643b18f8-6c85-43ec-977a-c9eade4db120-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.411400 4837 
reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.411455 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.411522 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.411583 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7vwb\" (UniqueName: \"kubernetes.io/projected/3af4ac68-a437-4be7-adab-1ef336f0cbda-kube-api-access-w7vwb\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.411664 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zz8b\" (UniqueName: \"kubernetes.io/projected/643b18f8-6c85-43ec-977a-c9eade4db120-kube-api-access-9zz8b\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.445853 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.518132 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "643b18f8-6c85-43ec-977a-c9eade4db120" (UID: "643b18f8-6c85-43ec-977a-c9eade4db120"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.521871 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-7kqcz"] Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.522095 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" podUID="306aa5e9-7f77-4ff8-9cf6-5b3255c85337" containerName="dnsmasq-dns" containerID="cri-o://2f11bc74222520fcf554ba948fc1d1529fb608acebf234b92a60442a96bc720f" gracePeriod=10 Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.527065 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.612063 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-config-data" (OuterVolumeSpecName: "config-data") pod "3af4ac68-a437-4be7-adab-1ef336f0cbda" (UID: "3af4ac68-a437-4be7-adab-1ef336f0cbda"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.612890 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.616899 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3af4ac68-a437-4be7-adab-1ef336f0cbda" (UID: "3af4ac68-a437-4be7-adab-1ef336f0cbda"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.629191 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.629231 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.629241 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.643795 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-config-data" (OuterVolumeSpecName: "config-data") pod "643b18f8-6c85-43ec-977a-c9eade4db120" (UID: "643b18f8-6c85-43ec-977a-c9eade4db120"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.684209 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.731339 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.776778 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"643b18f8-6c85-43ec-977a-c9eade4db120","Type":"ContainerDied","Data":"bd322992210798258d691847d1f616bdd9cfed212e86bf6a55ea38cd76ee2987"} Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.776856 4837 scope.go:117] "RemoveContainer" containerID="483de9baf8ab7c417841c02e737142f145dc9ff207f9fb3cb2947353d68dc2a1" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.777103 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.801531 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7m97" event={"ID":"3af4ac68-a437-4be7-adab-1ef336f0cbda","Type":"ContainerDied","Data":"a67502d2e54c1c7cec9684d1027629e2f2e584a89eba555b6f508bab8dc6003d"} Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.801576 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a67502d2e54c1c7cec9684d1027629e2f2e584a89eba555b6f508bab8dc6003d" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.801654 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.816977 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0173ba9-535a-435d-bc51-75c069e69e46","Type":"ContainerStarted","Data":"a0b7b975f0a853ab5afecbe29fde34fc4210243637b54de91d96895e00f81e30"} Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.841963 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.856483 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.858588 4837 generic.go:334] "Generic (PLEG): container finished" podID="306aa5e9-7f77-4ff8-9cf6-5b3255c85337" containerID="2f11bc74222520fcf554ba948fc1d1529fb608acebf234b92a60442a96bc720f" exitCode=0 Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.858688 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" event={"ID":"306aa5e9-7f77-4ff8-9cf6-5b3255c85337","Type":"ContainerDied","Data":"2f11bc74222520fcf554ba948fc1d1529fb608acebf234b92a60442a96bc720f"} Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.862934 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c5479d889-t9mnp" event={"ID":"0faefca0-6038-4bdf-856e-b7cb5b6c5536","Type":"ContainerStarted","Data":"9a02a987a1d45aed6ebc32b498a9af8ccb4aa210832c48787a22a25a2228e529"} Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.862967 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c5479d889-t9mnp" event={"ID":"0faefca0-6038-4bdf-856e-b7cb5b6c5536","Type":"ContainerStarted","Data":"ee2f2c6cd7031c0c388b4947ca3445235863139c835bb92b8b4570fbe2c76095"} Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.864763 4837 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee","Type":"ContainerStarted","Data":"367b20646b35759313f66e2deebf0c3f1def518ed4ba18cc4ba66cc774436167"} Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.877892 4837 scope.go:117] "RemoveContainer" containerID="45488696f1bb86187ce327f2ad0012c25e169de59468b4dda6460464fc5497ce" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.887081 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:07:05 crc kubenswrapper[4837]: E0313 12:07:05.887469 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="643b18f8-6c85-43ec-977a-c9eade4db120" containerName="glance-log" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.887498 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="643b18f8-6c85-43ec-977a-c9eade4db120" containerName="glance-log" Mar 13 12:07:05 crc kubenswrapper[4837]: E0313 12:07:05.887527 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="643b18f8-6c85-43ec-977a-c9eade4db120" containerName="glance-httpd" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.887533 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="643b18f8-6c85-43ec-977a-c9eade4db120" containerName="glance-httpd" Mar 13 12:07:05 crc kubenswrapper[4837]: E0313 12:07:05.887552 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3af4ac68-a437-4be7-adab-1ef336f0cbda" containerName="keystone-bootstrap" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.887558 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="3af4ac68-a437-4be7-adab-1ef336f0cbda" containerName="keystone-bootstrap" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.888275 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="643b18f8-6c85-43ec-977a-c9eade4db120" containerName="glance-httpd" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.888312 4837 
memory_manager.go:354] "RemoveStaleState removing state" podUID="643b18f8-6c85-43ec-977a-c9eade4db120" containerName="glance-log" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.888335 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="3af4ac68-a437-4be7-adab-1ef336f0cbda" containerName="keystone-bootstrap" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.889486 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.899120 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.903144 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8vx8g" event={"ID":"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76","Type":"ContainerStarted","Data":"117a085c3636a60886a4974e5b0fb9b17907bfbb02c0f28e14e88a6a4aada355"} Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.903622 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.917518 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.993569 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-8vx8g" podStartSLOduration=2.794016875 podStartE2EDuration="38.993544499s" podCreationTimestamp="2026-03-13 12:06:27 +0000 UTC" firstStartedPulling="2026-03-13 12:06:28.886926421 +0000 UTC m=+1104.525193174" lastFinishedPulling="2026-03-13 12:07:05.086454035 +0000 UTC m=+1140.724720798" observedRunningTime="2026-03-13 12:07:05.991804974 +0000 UTC m=+1141.630071737" watchObservedRunningTime="2026-03-13 12:07:05.993544499 +0000 UTC m=+1141.631811252" Mar 13 12:07:06 crc 
kubenswrapper[4837]: I0313 12:07:06.061825 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.061929 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cp4n\" (UniqueName: \"kubernetes.io/projected/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-kube-api-access-4cp4n\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.061967 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.061989 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.062026 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-logs\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 
12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.062056 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.062104 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.062192 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.157445 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.157492 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.169731 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.169792 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.169853 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cp4n\" (UniqueName: \"kubernetes.io/projected/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-kube-api-access-4cp4n\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.169877 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.169892 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.169929 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-logs\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.169951 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.169996 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.171823 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5596f9dfb8-m9bxb" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.172072 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.174883 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-logs\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.176094 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.184536 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.190809 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.201798 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.250009 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.250040 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.252841 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-fd6ddfd9b-f66l8" podUID="4d3df345-07a2-41bf-aae4-088b3ce83b63" containerName="horizon" probeResult="failure" output="Get 
\"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.265191 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cp4n\" (UniqueName: \"kubernetes.io/projected/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-kube-api-access-4cp4n\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.270688 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.315534 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.387694 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-55dc4d44f8-mvjvg"] Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.389076 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.399435 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.399769 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-w6mdg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.399939 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.400121 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.400315 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.400364 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.411102 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-55dc4d44f8-mvjvg"] Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.444042 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.477581 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-fernet-keys\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.477665 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-credential-keys\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.477693 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-internal-tls-certs\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.477726 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-public-tls-certs\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.477743 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-combined-ca-bundle\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: 
\"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.477770 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-config-data\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.477802 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tn5x\" (UniqueName: \"kubernetes.io/projected/9cb9614d-a433-4be3-8145-4c1c8593404f-kube-api-access-8tn5x\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.477842 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-scripts\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.485933 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.579780 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-ovsdbserver-sb\") pod \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.579882 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-ovsdbserver-nb\") pod \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.579915 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmp8d\" (UniqueName: \"kubernetes.io/projected/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-kube-api-access-nmp8d\") pod \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.579978 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-dns-swift-storage-0\") pod \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.580087 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-config\") pod \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.580143 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-dns-svc\") pod \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.580444 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tn5x\" (UniqueName: \"kubernetes.io/projected/9cb9614d-a433-4be3-8145-4c1c8593404f-kube-api-access-8tn5x\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.580510 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-scripts\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.580586 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-fernet-keys\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.580650 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-credential-keys\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.580682 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-internal-tls-certs\") pod 
\"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.580726 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-public-tls-certs\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.580747 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-combined-ca-bundle\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.580783 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-config-data\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.588091 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-config-data\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.603458 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-credential-keys\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " 
pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.605300 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-fernet-keys\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.606104 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-kube-api-access-nmp8d" (OuterVolumeSpecName: "kube-api-access-nmp8d") pod "306aa5e9-7f77-4ff8-9cf6-5b3255c85337" (UID: "306aa5e9-7f77-4ff8-9cf6-5b3255c85337"). InnerVolumeSpecName "kube-api-access-nmp8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.612665 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-internal-tls-certs\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.615310 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-scripts\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.615917 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-combined-ca-bundle\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 
12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.620086 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-public-tls-certs\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.644157 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tn5x\" (UniqueName: \"kubernetes.io/projected/9cb9614d-a433-4be3-8145-4c1c8593404f-kube-api-access-8tn5x\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.685870 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmp8d\" (UniqueName: \"kubernetes.io/projected/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-kube-api-access-nmp8d\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.686623 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "306aa5e9-7f77-4ff8-9cf6-5b3255c85337" (UID: "306aa5e9-7f77-4ff8-9cf6-5b3255c85337"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.733653 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "306aa5e9-7f77-4ff8-9cf6-5b3255c85337" (UID: "306aa5e9-7f77-4ff8-9cf6-5b3255c85337"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.743307 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "306aa5e9-7f77-4ff8-9cf6-5b3255c85337" (UID: "306aa5e9-7f77-4ff8-9cf6-5b3255c85337"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.754086 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "306aa5e9-7f77-4ff8-9cf6-5b3255c85337" (UID: "306aa5e9-7f77-4ff8-9cf6-5b3255c85337"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.757342 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.762497 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-config" (OuterVolumeSpecName: "config") pod "306aa5e9-7f77-4ff8-9cf6-5b3255c85337" (UID: "306aa5e9-7f77-4ff8-9cf6-5b3255c85337"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.787171 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.787352 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.787653 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.787746 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.787908 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.926936 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0173ba9-535a-435d-bc51-75c069e69e46","Type":"ContainerStarted","Data":"5e147e84ce0affb8bbda5c741ef88617e4ee699f66923e1ff90efae96ad20482"} Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.054025 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.122555 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="643b18f8-6c85-43ec-977a-c9eade4db120" path="/var/lib/kubelet/pods/643b18f8-6c85-43ec-977a-c9eade4db120/volumes" Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.132141 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.132175 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" event={"ID":"306aa5e9-7f77-4ff8-9cf6-5b3255c85337","Type":"ContainerDied","Data":"2a0f4fde059e2510bd13af9a796ea4745ff14474c756cbe8a1063e240ce40a71"} Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.132207 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c5479d889-t9mnp" event={"ID":"0faefca0-6038-4bdf-856e-b7cb5b6c5536","Type":"ContainerStarted","Data":"541466fc166402b9bfa4140bd97e50553b49c072d454b4f07847687fa559214e"} Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.132219 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-b6qnm" event={"ID":"95b808e7-674f-4592-af6e-f7c8682f6a17","Type":"ContainerStarted","Data":"ba3dda01a90b7b0d00508491184e90f099c7ae7bc849213376ebbc68b88ffd0f"} Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.134489 4837 scope.go:117] "RemoveContainer" containerID="2f11bc74222520fcf554ba948fc1d1529fb608acebf234b92a60442a96bc720f" Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.156031 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-7kqcz"] Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.168304 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-7kqcz"] Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.205209 4837 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.226563 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-b6qnm" podStartSLOduration=3.20119477 podStartE2EDuration="40.226540153s" podCreationTimestamp="2026-03-13 12:06:27 +0000 UTC" firstStartedPulling="2026-03-13 12:06:28.873127308 +0000 UTC m=+1104.511394061" lastFinishedPulling="2026-03-13 12:07:05.898472681 +0000 UTC m=+1141.536739444" observedRunningTime="2026-03-13 12:07:07.187429304 +0000 UTC m=+1142.825696057" watchObservedRunningTime="2026-03-13 12:07:07.226540153 +0000 UTC m=+1142.864806916" Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.233968 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-c5479d889-t9mnp" podStartSLOduration=8.233951736 podStartE2EDuration="8.233951736s" podCreationTimestamp="2026-03-13 12:06:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:07.219380798 +0000 UTC m=+1142.857647561" watchObservedRunningTime="2026-03-13 12:07:07.233951736 +0000 UTC m=+1142.872218499" Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.270374 4837 scope.go:117] "RemoveContainer" containerID="a952d72f45aa2f65e1c6c2e7322bdcf16fb7324473b881a19caa21b16b66a760" Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.473395 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-55dc4d44f8-mvjvg"] Mar 13 12:07:08 crc kubenswrapper[4837]: I0313 12:07:08.194906 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9fdb2289-943a-4078-ab5f-cab9a7b4faf1","Type":"ContainerStarted","Data":"4c6f80cedfefe6ca3ffa3fd1f8e5bca2af1a1e041ef15266c27ebfeb6b6939ec"} Mar 13 12:07:08 crc kubenswrapper[4837]: I0313 
12:07:08.195424 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9fdb2289-943a-4078-ab5f-cab9a7b4faf1","Type":"ContainerStarted","Data":"9af1f1b6bae1b057a7c5b2be284aed718dd1bd53fd4267a097ec24a461a2d852"} Mar 13 12:07:08 crc kubenswrapper[4837]: I0313 12:07:08.200781 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0173ba9-535a-435d-bc51-75c069e69e46","Type":"ContainerStarted","Data":"06e09154da451b1a2177b0ac750f567de80b4f12b1c2aa79102cdc2b77f671b6"} Mar 13 12:07:08 crc kubenswrapper[4837]: I0313 12:07:08.222742 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55dc4d44f8-mvjvg" event={"ID":"9cb9614d-a433-4be3-8145-4c1c8593404f","Type":"ContainerStarted","Data":"3294b982819f14d3b39958636e95d0c9c0debe4154e94196c3a7a5c537db54dd"} Mar 13 12:07:08 crc kubenswrapper[4837]: I0313 12:07:08.223156 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:08 crc kubenswrapper[4837]: I0313 12:07:08.223169 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55dc4d44f8-mvjvg" event={"ID":"9cb9614d-a433-4be3-8145-4c1c8593404f","Type":"ContainerStarted","Data":"74e03940743ec716715181a666394a63c07578634a654ed8b412096592d067ad"} Mar 13 12:07:08 crc kubenswrapper[4837]: I0313 12:07:08.234657 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.2346299 podStartE2EDuration="8.2346299s" podCreationTimestamp="2026-03-13 12:07:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:08.226469464 +0000 UTC m=+1143.864736227" watchObservedRunningTime="2026-03-13 12:07:08.2346299 +0000 UTC m=+1143.872896673" Mar 13 12:07:08 crc kubenswrapper[4837]: I0313 
12:07:08.279210 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-55dc4d44f8-mvjvg" podStartSLOduration=2.27919109 podStartE2EDuration="2.27919109s" podCreationTimestamp="2026-03-13 12:07:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:08.263145957 +0000 UTC m=+1143.901412720" watchObservedRunningTime="2026-03-13 12:07:08.27919109 +0000 UTC m=+1143.917457853" Mar 13 12:07:09 crc kubenswrapper[4837]: I0313 12:07:09.060322 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="306aa5e9-7f77-4ff8-9cf6-5b3255c85337" path="/var/lib/kubelet/pods/306aa5e9-7f77-4ff8-9cf6-5b3255c85337/volumes" Mar 13 12:07:09 crc kubenswrapper[4837]: I0313 12:07:09.236266 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9fdb2289-943a-4078-ab5f-cab9a7b4faf1","Type":"ContainerStarted","Data":"a7e9d992e509609ea914f80658069ef20b3e4ab7548f88fd1489567b1ca63a1f"} Mar 13 12:07:09 crc kubenswrapper[4837]: I0313 12:07:09.238427 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qdzjz" event={"ID":"a44db1d6-6da2-41a5-a37f-ffc602f0d55a","Type":"ContainerStarted","Data":"843cf40344096a3f0565478be09bc819697f7ebe87515db62c711cd361ef6ce2"} Mar 13 12:07:09 crc kubenswrapper[4837]: I0313 12:07:09.264797 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.264772961 podStartE2EDuration="4.264772961s" podCreationTimestamp="2026-03-13 12:07:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:09.258883495 +0000 UTC m=+1144.897150268" watchObservedRunningTime="2026-03-13 12:07:09.264772961 +0000 UTC m=+1144.903039724" Mar 13 12:07:09 crc 
kubenswrapper[4837]: I0313 12:07:09.290715 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-qdzjz" podStartSLOduration=4.441583107 podStartE2EDuration="42.290692005s" podCreationTimestamp="2026-03-13 12:06:27 +0000 UTC" firstStartedPulling="2026-03-13 12:06:28.890160553 +0000 UTC m=+1104.528427316" lastFinishedPulling="2026-03-13 12:07:06.739269451 +0000 UTC m=+1142.377536214" observedRunningTime="2026-03-13 12:07:09.28161481 +0000 UTC m=+1144.919881573" watchObservedRunningTime="2026-03-13 12:07:09.290692005 +0000 UTC m=+1144.928958768" Mar 13 12:07:11 crc kubenswrapper[4837]: I0313 12:07:11.457203 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 12:07:11 crc kubenswrapper[4837]: I0313 12:07:11.457268 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 12:07:11 crc kubenswrapper[4837]: I0313 12:07:11.499987 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 12:07:11 crc kubenswrapper[4837]: I0313 12:07:11.515520 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 12:07:12 crc kubenswrapper[4837]: I0313 12:07:12.276924 4837 generic.go:334] "Generic (PLEG): container finished" podID="08c7b2a5-b0b8-433f-b55d-c64eaeea8b76" containerID="117a085c3636a60886a4974e5b0fb9b17907bfbb02c0f28e14e88a6a4aada355" exitCode=0 Mar 13 12:07:12 crc kubenswrapper[4837]: I0313 12:07:12.277017 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8vx8g" event={"ID":"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76","Type":"ContainerDied","Data":"117a085c3636a60886a4974e5b0fb9b17907bfbb02c0f28e14e88a6a4aada355"} Mar 13 12:07:12 crc kubenswrapper[4837]: I0313 12:07:12.277411 4837 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 12:07:12 crc kubenswrapper[4837]: I0313 12:07:12.277445 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 12:07:14 crc kubenswrapper[4837]: I0313 12:07:14.958992 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.320789 4837 generic.go:334] "Generic (PLEG): container finished" podID="95b808e7-674f-4592-af6e-f7c8682f6a17" containerID="ba3dda01a90b7b0d00508491184e90f099c7ae7bc849213376ebbc68b88ffd0f" exitCode=0 Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.320844 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-b6qnm" event={"ID":"95b808e7-674f-4592-af6e-f7c8682f6a17","Type":"ContainerDied","Data":"ba3dda01a90b7b0d00508491184e90f099c7ae7bc849213376ebbc68b88ffd0f"} Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.507177 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.648166 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-8vx8g" Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.732528 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-combined-ca-bundle\") pod \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.732684 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-config-data\") pod \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.733182 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vtt2\" (UniqueName: \"kubernetes.io/projected/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-kube-api-access-2vtt2\") pod \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.733228 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-scripts\") pod \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.733411 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-logs\") pod \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.735597 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-logs" (OuterVolumeSpecName: "logs") pod "08c7b2a5-b0b8-433f-b55d-c64eaeea8b76" (UID: "08c7b2a5-b0b8-433f-b55d-c64eaeea8b76"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.749079 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-scripts" (OuterVolumeSpecName: "scripts") pod "08c7b2a5-b0b8-433f-b55d-c64eaeea8b76" (UID: "08c7b2a5-b0b8-433f-b55d-c64eaeea8b76"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.756734 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-kube-api-access-2vtt2" (OuterVolumeSpecName: "kube-api-access-2vtt2") pod "08c7b2a5-b0b8-433f-b55d-c64eaeea8b76" (UID: "08c7b2a5-b0b8-433f-b55d-c64eaeea8b76"). InnerVolumeSpecName "kube-api-access-2vtt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.783341 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08c7b2a5-b0b8-433f-b55d-c64eaeea8b76" (UID: "08c7b2a5-b0b8-433f-b55d-c64eaeea8b76"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.840281 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vtt2\" (UniqueName: \"kubernetes.io/projected/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-kube-api-access-2vtt2\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.840315 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.840326 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.840334 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.848960 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-config-data" (OuterVolumeSpecName: "config-data") pod "08c7b2a5-b0b8-433f-b55d-c64eaeea8b76" (UID: "08c7b2a5-b0b8-433f-b55d-c64eaeea8b76"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.941881 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.155712 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5596f9dfb8-m9bxb" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.251176 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-fd6ddfd9b-f66l8" podUID="4d3df345-07a2-41bf-aae4-088b3ce83b63" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.346541 4837 generic.go:334] "Generic (PLEG): container finished" podID="a44db1d6-6da2-41a5-a37f-ffc602f0d55a" containerID="843cf40344096a3f0565478be09bc819697f7ebe87515db62c711cd361ef6ce2" exitCode=0 Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.346690 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qdzjz" event={"ID":"a44db1d6-6da2-41a5-a37f-ffc602f0d55a","Type":"ContainerDied","Data":"843cf40344096a3f0565478be09bc819697f7ebe87515db62c711cd361ef6ce2"} Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.350436 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8vx8g" event={"ID":"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76","Type":"ContainerDied","Data":"d8d4fa30fd1f227e47a679c4ebd48ddee761f9902a8c45ed343c205dc3f7e3b1"} Mar 13 12:07:16 crc kubenswrapper[4837]: 
I0313 12:07:16.350480 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8vx8g" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.350490 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8d4fa30fd1f227e47a679c4ebd48ddee761f9902a8c45ed343c205dc3f7e3b1" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.487006 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.487059 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.552027 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.580818 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.784742 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-b6qnm" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.794378 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-59f7b5dc8d-rnsz6"] Mar 13 12:07:16 crc kubenswrapper[4837]: E0313 12:07:16.794777 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c7b2a5-b0b8-433f-b55d-c64eaeea8b76" containerName="placement-db-sync" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.794794 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c7b2a5-b0b8-433f-b55d-c64eaeea8b76" containerName="placement-db-sync" Mar 13 12:07:16 crc kubenswrapper[4837]: E0313 12:07:16.794817 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306aa5e9-7f77-4ff8-9cf6-5b3255c85337" containerName="init" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.794823 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="306aa5e9-7f77-4ff8-9cf6-5b3255c85337" containerName="init" Mar 13 12:07:16 crc kubenswrapper[4837]: E0313 12:07:16.794837 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306aa5e9-7f77-4ff8-9cf6-5b3255c85337" containerName="dnsmasq-dns" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.794844 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="306aa5e9-7f77-4ff8-9cf6-5b3255c85337" containerName="dnsmasq-dns" Mar 13 12:07:16 crc kubenswrapper[4837]: E0313 12:07:16.794864 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95b808e7-674f-4592-af6e-f7c8682f6a17" containerName="barbican-db-sync" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.794869 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b808e7-674f-4592-af6e-f7c8682f6a17" containerName="barbican-db-sync" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.795045 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c7b2a5-b0b8-433f-b55d-c64eaeea8b76" containerName="placement-db-sync" Mar 13 
12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.795067 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="95b808e7-674f-4592-af6e-f7c8682f6a17" containerName="barbican-db-sync" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.795079 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="306aa5e9-7f77-4ff8-9cf6-5b3255c85337" containerName="dnsmasq-dns" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.795986 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.798206 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.798512 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.798622 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.799309 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.799563 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-s6tq2" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.811981 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-59f7b5dc8d-rnsz6"] Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.858453 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95b808e7-674f-4592-af6e-f7c8682f6a17-combined-ca-bundle\") pod \"95b808e7-674f-4592-af6e-f7c8682f6a17\" (UID: \"95b808e7-674f-4592-af6e-f7c8682f6a17\") " Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 
12:07:16.858581 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sh7t\" (UniqueName: \"kubernetes.io/projected/95b808e7-674f-4592-af6e-f7c8682f6a17-kube-api-access-9sh7t\") pod \"95b808e7-674f-4592-af6e-f7c8682f6a17\" (UID: \"95b808e7-674f-4592-af6e-f7c8682f6a17\") " Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.858734 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/95b808e7-674f-4592-af6e-f7c8682f6a17-db-sync-config-data\") pod \"95b808e7-674f-4592-af6e-f7c8682f6a17\" (UID: \"95b808e7-674f-4592-af6e-f7c8682f6a17\") " Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.859249 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-public-tls-certs\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.859342 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07eece9e-0e59-4a06-8fea-efb4217d6907-logs\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.859409 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-internal-tls-certs\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.859468 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-combined-ca-bundle\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.859567 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-scripts\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.859598 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-config-data\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.859629 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w87qw\" (UniqueName: \"kubernetes.io/projected/07eece9e-0e59-4a06-8fea-efb4217d6907-kube-api-access-w87qw\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.880301 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95b808e7-674f-4592-af6e-f7c8682f6a17-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "95b808e7-674f-4592-af6e-f7c8682f6a17" (UID: "95b808e7-674f-4592-af6e-f7c8682f6a17"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.902586 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95b808e7-674f-4592-af6e-f7c8682f6a17-kube-api-access-9sh7t" (OuterVolumeSpecName: "kube-api-access-9sh7t") pod "95b808e7-674f-4592-af6e-f7c8682f6a17" (UID: "95b808e7-674f-4592-af6e-f7c8682f6a17"). InnerVolumeSpecName "kube-api-access-9sh7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.931908 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95b808e7-674f-4592-af6e-f7c8682f6a17-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95b808e7-674f-4592-af6e-f7c8682f6a17" (UID: "95b808e7-674f-4592-af6e-f7c8682f6a17"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.961431 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-scripts\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.961619 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-config-data\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.961751 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w87qw\" (UniqueName: \"kubernetes.io/projected/07eece9e-0e59-4a06-8fea-efb4217d6907-kube-api-access-w87qw\") pod 
\"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.961806 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-public-tls-certs\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.961870 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07eece9e-0e59-4a06-8fea-efb4217d6907-logs\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.961920 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-internal-tls-certs\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.961970 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-combined-ca-bundle\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.962050 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95b808e7-674f-4592-af6e-f7c8682f6a17-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.962074 4837 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sh7t\" (UniqueName: \"kubernetes.io/projected/95b808e7-674f-4592-af6e-f7c8682f6a17-kube-api-access-9sh7t\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.962087 4837 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/95b808e7-674f-4592-af6e-f7c8682f6a17-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.965884 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07eece9e-0e59-4a06-8fea-efb4217d6907-logs\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.968622 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-combined-ca-bundle\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.970111 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-scripts\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.970261 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-internal-tls-certs\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc 
kubenswrapper[4837]: I0313 12:07:16.978260 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-public-tls-certs\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.983606 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w87qw\" (UniqueName: \"kubernetes.io/projected/07eece9e-0e59-4a06-8fea-efb4217d6907-kube-api-access-w87qw\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.986367 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-config-data\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.114860 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:17 crc kubenswrapper[4837]: E0313 12:07:17.173061 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.412106 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-b6qnm" event={"ID":"95b808e7-674f-4592-af6e-f7c8682f6a17","Type":"ContainerDied","Data":"02cdc5326e2dbc385d4e7090105a3655b6651929ef4db12950f0c379aaf98274"} Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.412147 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02cdc5326e2dbc385d4e7090105a3655b6651929ef4db12950f0c379aaf98274" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.412208 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-b6qnm" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.420697 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee","Type":"ContainerStarted","Data":"efca9fe013107e0157b7dfeec701a6bf70c8455183d2d8d806b63a4c79489237"} Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.420745 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.420868 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerName="ceilometer-notification-agent" containerID="cri-o://9c2af154abb9a37a270c00c3cc335b4994ab6bb24ddaf80f1f5bfc313a6b9fb6" gracePeriod=30 Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.421052 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.421089 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerName="proxy-httpd" containerID="cri-o://efca9fe013107e0157b7dfeec701a6bf70c8455183d2d8d806b63a4c79489237" gracePeriod=30 Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.421126 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerName="sg-core" containerID="cri-o://367b20646b35759313f66e2deebf0c3f1def518ed4ba18cc4ba66cc774436167" gracePeriod=30 Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.421490 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.723273 4837 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6f4ff9ff9-mjmsz"] Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.731904 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.744508 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-ktqxm" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.745065 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.745863 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.754322 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6f4ff9ff9-mjmsz"] Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.818446 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-58c489697d-dgjtz"] Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.820245 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.836267 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.894577 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55084c82-a823-4f31-926e-21702ba02ba1-combined-ca-bundle\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.894626 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-config-data\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.894670 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55084c82-a823-4f31-926e-21702ba02ba1-config-data\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.894689 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp6xz\" (UniqueName: \"kubernetes.io/projected/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-kube-api-access-vp6xz\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:17 crc 
kubenswrapper[4837]: I0313 12:07:17.894771 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55084c82-a823-4f31-926e-21702ba02ba1-config-data-custom\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.894789 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ktjq\" (UniqueName: \"kubernetes.io/projected/55084c82-a823-4f31-926e-21702ba02ba1-kube-api-access-4ktjq\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.894807 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-combined-ca-bundle\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.894829 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55084c82-a823-4f31-926e-21702ba02ba1-logs\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.894849 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-config-data-custom\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" 
(UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.894883 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-logs\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.954314 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-59f7b5dc8d-rnsz6"] Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.979714 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-nn655"] Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.981305 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.006358 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55084c82-a823-4f31-926e-21702ba02ba1-config-data-custom\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.006443 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ktjq\" (UniqueName: \"kubernetes.io/projected/55084c82-a823-4f31-926e-21702ba02ba1-kube-api-access-4ktjq\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.006484 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-combined-ca-bundle\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.006530 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55084c82-a823-4f31-926e-21702ba02ba1-logs\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.006565 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-config-data-custom\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.006657 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-logs\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.006768 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55084c82-a823-4f31-926e-21702ba02ba1-combined-ca-bundle\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.006803 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-config-data\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.006848 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55084c82-a823-4f31-926e-21702ba02ba1-config-data\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.006877 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp6xz\" (UniqueName: \"kubernetes.io/projected/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-kube-api-access-vp6xz\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.007733 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55084c82-a823-4f31-926e-21702ba02ba1-logs\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.010948 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-logs\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.016157 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-config-data-custom\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.016456 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55084c82-a823-4f31-926e-21702ba02ba1-config-data\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.016756 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-58c489697d-dgjtz"] Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.018066 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55084c82-a823-4f31-926e-21702ba02ba1-combined-ca-bundle\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.029733 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-config-data\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.030499 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-combined-ca-bundle\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: 
\"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.044459 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55084c82-a823-4f31-926e-21702ba02ba1-config-data-custom\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.058238 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ktjq\" (UniqueName: \"kubernetes.io/projected/55084c82-a823-4f31-926e-21702ba02ba1-kube-api-access-4ktjq\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.075575 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.078527 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp6xz\" (UniqueName: \"kubernetes.io/projected/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-kube-api-access-vp6xz\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.102156 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-nn655"] Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.116878 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.116963 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.117035 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hhlz\" (UniqueName: \"kubernetes.io/projected/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-kube-api-access-9hhlz\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.117124 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.117167 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-dns-svc\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.117211 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-config\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.180619 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7598d89cd4-qfmh9"] Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.206184 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7598d89cd4-qfmh9"] Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.206292 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.212117 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.218836 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.218924 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.219055 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hhlz\" (UniqueName: \"kubernetes.io/projected/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-kube-api-access-9hhlz\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.219206 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.219260 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-dns-svc\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.219331 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-config\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.220175 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.220672 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.226517 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.227197 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-dns-svc\") pod 
\"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.227512 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-config\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.242953 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hhlz\" (UniqueName: \"kubernetes.io/projected/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-kube-api-access-9hhlz\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.317138 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.323245 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx5rf\" (UniqueName: \"kubernetes.io/projected/91206ea2-5d2b-478d-983e-6c842f02819b-kube-api-access-bx5rf\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.323294 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-config-data-custom\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.323313 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-config-data\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.323366 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91206ea2-5d2b-478d-983e-6c842f02819b-logs\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.323395 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-combined-ca-bundle\") pod 
\"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.349288 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.384636 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.426431 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-config-data\") pod \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.427208 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-etc-machine-id\") pod \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.427242 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-combined-ca-bundle\") pod \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.427273 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-scripts\") pod \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.427330 4837 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p5k2\" (UniqueName: \"kubernetes.io/projected/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-kube-api-access-4p5k2\") pod \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.427404 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-db-sync-config-data\") pod \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.427622 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx5rf\" (UniqueName: \"kubernetes.io/projected/91206ea2-5d2b-478d-983e-6c842f02819b-kube-api-access-bx5rf\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.427733 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-config-data-custom\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.427758 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-config-data\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.427832 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/91206ea2-5d2b-478d-983e-6c842f02819b-logs\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.427874 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-combined-ca-bundle\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.433184 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-combined-ca-bundle\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.433234 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a44db1d6-6da2-41a5-a37f-ffc602f0d55a" (UID: "a44db1d6-6da2-41a5-a37f-ffc602f0d55a"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.433555 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91206ea2-5d2b-478d-983e-6c842f02819b-logs\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.434885 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a44db1d6-6da2-41a5-a37f-ffc602f0d55a" (UID: "a44db1d6-6da2-41a5-a37f-ffc602f0d55a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.439577 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-config-data-custom\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.450883 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-kube-api-access-4p5k2" (OuterVolumeSpecName: "kube-api-access-4p5k2") pod "a44db1d6-6da2-41a5-a37f-ffc602f0d55a" (UID: "a44db1d6-6da2-41a5-a37f-ffc602f0d55a"). InnerVolumeSpecName "kube-api-access-4p5k2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.458064 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-config-data\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.473863 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-scripts" (OuterVolumeSpecName: "scripts") pod "a44db1d6-6da2-41a5-a37f-ffc602f0d55a" (UID: "a44db1d6-6da2-41a5-a37f-ffc602f0d55a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.480625 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx5rf\" (UniqueName: \"kubernetes.io/projected/91206ea2-5d2b-478d-983e-6c842f02819b-kube-api-access-bx5rf\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.528514 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qdzjz" event={"ID":"a44db1d6-6da2-41a5-a37f-ffc602f0d55a","Type":"ContainerDied","Data":"d87408c4f80f070da48980a1c0c42ec26d6e0f566d37471876ae97d32157796e"} Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.528884 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d87408c4f80f070da48980a1c0c42ec26d6e0f566d37471876ae97d32157796e" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.528988 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.529455 4837 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.529482 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.529491 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p5k2\" (UniqueName: \"kubernetes.io/projected/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-kube-api-access-4p5k2\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.529500 4837 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.537841 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a44db1d6-6da2-41a5-a37f-ffc602f0d55a" (UID: "a44db1d6-6da2-41a5-a37f-ffc602f0d55a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.539982 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59f7b5dc8d-rnsz6" event={"ID":"07eece9e-0e59-4a06-8fea-efb4217d6907","Type":"ContainerStarted","Data":"790e0c04a60b645164e78de86a6f7c1dd04d1f00a81d1f05a478395fa2197f78"} Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.565319 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.593623 4837 generic.go:334] "Generic (PLEG): container finished" podID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerID="efca9fe013107e0157b7dfeec701a6bf70c8455183d2d8d806b63a4c79489237" exitCode=0 Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.593671 4837 generic.go:334] "Generic (PLEG): container finished" podID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerID="367b20646b35759313f66e2deebf0c3f1def518ed4ba18cc4ba66cc774436167" exitCode=2 Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.594164 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee","Type":"ContainerDied","Data":"efca9fe013107e0157b7dfeec701a6bf70c8455183d2d8d806b63a4c79489237"} Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.594231 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee","Type":"ContainerDied","Data":"367b20646b35759313f66e2deebf0c3f1def518ed4ba18cc4ba66cc774436167"} Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.632101 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.663724 4837 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:07:18 crc kubenswrapper[4837]: E0313 12:07:18.664125 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a44db1d6-6da2-41a5-a37f-ffc602f0d55a" containerName="cinder-db-sync" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.664138 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a44db1d6-6da2-41a5-a37f-ffc602f0d55a" containerName="cinder-db-sync" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.664356 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a44db1d6-6da2-41a5-a37f-ffc602f0d55a" containerName="cinder-db-sync" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.665290 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.668423 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.670865 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-config-data" (OuterVolumeSpecName: "config-data") pod "a44db1d6-6da2-41a5-a37f-ffc602f0d55a" (UID: "a44db1d6-6da2-41a5-a37f-ffc602f0d55a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.710208 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.733783 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.734206 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-scripts\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.734248 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.734385 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-config-data\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.734441 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwghc\" (UniqueName: 
\"kubernetes.io/projected/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-kube-api-access-qwghc\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.734481 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.734622 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.767408 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6f4ff9ff9-mjmsz"] Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.835712 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-config-data\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.835785 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwghc\" (UniqueName: \"kubernetes.io/projected/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-kube-api-access-qwghc\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.835814 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.835878 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.835910 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-scripts\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.835936 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.836780 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.850291 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " 
pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.850902 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.851916 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-config-data\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.853797 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-scripts\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.890370 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwghc\" (UniqueName: \"kubernetes.io/projected/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-kube-api-access-qwghc\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.897392 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-nn655"] Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.938574 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.958280 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-txzkw"] Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.973183 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.978202 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-txzkw"] Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.990970 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.992756 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.997121 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.040155 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-config-data-custom\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.040212 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-scripts\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.040238 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.040274 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppxfm\" (UniqueName: \"kubernetes.io/projected/6f484085-7b83-46a8-80c2-b3ef6f8b8798-kube-api-access-ppxfm\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.040312 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.040378 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f484085-7b83-46a8-80c2-b3ef6f8b8798-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.040422 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f484085-7b83-46a8-80c2-b3ef6f8b8798-logs\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.040473 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.040505 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-config\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.040528 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2wh2\" (UniqueName: \"kubernetes.io/projected/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-kube-api-access-s2wh2\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.040551 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.040585 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.040605 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-config-data\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.095523 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.141823 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f484085-7b83-46a8-80c2-b3ef6f8b8798-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.141877 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f484085-7b83-46a8-80c2-b3ef6f8b8798-logs\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.141920 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.141975 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2wh2\" (UniqueName: \"kubernetes.io/projected/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-kube-api-access-s2wh2\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.141992 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-config\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.142011 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.142029 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.142045 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-config-data\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.142094 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-config-data-custom\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.142114 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-scripts\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " 
pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.142143 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.142172 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppxfm\" (UniqueName: \"kubernetes.io/projected/6f484085-7b83-46a8-80c2-b3ef6f8b8798-kube-api-access-ppxfm\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.142196 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.143049 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.143582 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 
12:07:19.143600 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-config\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.143696 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f484085-7b83-46a8-80c2-b3ef6f8b8798-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.144099 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.144721 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f484085-7b83-46a8-80c2-b3ef6f8b8798-logs\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.145779 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.162416 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-config-data-custom\") 
pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.162673 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.164286 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-config-data\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.166534 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-scripts\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.169139 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2wh2\" (UniqueName: \"kubernetes.io/projected/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-kube-api-access-s2wh2\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.183294 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppxfm\" (UniqueName: \"kubernetes.io/projected/6f484085-7b83-46a8-80c2-b3ef6f8b8798-kube-api-access-ppxfm\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.207888 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-85ff748b95-nn655"] Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.213387 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-58c489697d-dgjtz"] Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.355096 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.403327 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.569689 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7598d89cd4-qfmh9"] Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.610381 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59f7b5dc8d-rnsz6" event={"ID":"07eece9e-0e59-4a06-8fea-efb4217d6907","Type":"ContainerStarted","Data":"6a04090f3a67fe0e7b4a52cc36393d294f41e3eefc5aa787e9d8b0ac7104fabd"} Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.610422 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59f7b5dc8d-rnsz6" event={"ID":"07eece9e-0e59-4a06-8fea-efb4217d6907","Type":"ContainerStarted","Data":"08f34771db4517922084f9af36c9fc7b53eda6e562f0e4d0c248471961304247"} Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.610899 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.610918 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.614440 4837 generic.go:334] "Generic (PLEG): container finished" podID="8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc" containerID="1ba1ecdd77454b316a59d76a9628041bd162e97b7a353cd219789d008b6bfecc" exitCode=0 Mar 13 12:07:19 crc 
kubenswrapper[4837]: I0313 12:07:19.614831 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-nn655" event={"ID":"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc","Type":"ContainerDied","Data":"1ba1ecdd77454b316a59d76a9628041bd162e97b7a353cd219789d008b6bfecc"} Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.614866 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-nn655" event={"ID":"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc","Type":"ContainerStarted","Data":"0c767985dc8b487e12a79302a8fc49009a12a9085c34846f811ba339c130fda9"} Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.620102 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" event={"ID":"55084c82-a823-4f31-926e-21702ba02ba1","Type":"ContainerStarted","Data":"c2b48c97eeb59bbfb8ae4e79f9743fb8b22a8899c906ed3b96615efa53b32d03"} Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.622896 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" event={"ID":"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9","Type":"ContainerStarted","Data":"22b99983d4298366daa1ac0a7327f79a61b6b54e0d37feba4a924c3afab8ca2c"} Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.622927 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.622949 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.665318 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-59f7b5dc8d-rnsz6" podStartSLOduration=3.665297548 podStartE2EDuration="3.665297548s" podCreationTimestamp="2026-03-13 12:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:19.644375241 +0000 
UTC m=+1155.282642004" watchObservedRunningTime="2026-03-13 12:07:19.665297548 +0000 UTC m=+1155.303564311" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.754895 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.115905 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-txzkw"] Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.269199 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.361529 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.517976 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-ovsdbserver-nb\") pod \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.518048 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-config\") pod \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.518175 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-ovsdbserver-sb\") pod \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.518194 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-dns-swift-storage-0\") pod \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.518228 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hhlz\" (UniqueName: \"kubernetes.io/projected/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-kube-api-access-9hhlz\") pod \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.518305 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-dns-svc\") pod \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.564059 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc" (UID: "8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.567183 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc" (UID: "8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.571238 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc" (UID: "8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.572800 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-kube-api-access-9hhlz" (OuterVolumeSpecName: "kube-api-access-9hhlz") pod "8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc" (UID: "8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc"). InnerVolumeSpecName "kube-api-access-9hhlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.596503 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-config" (OuterVolumeSpecName: "config") pod "8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc" (UID: "8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.611709 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc" (UID: "8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.622906 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.622947 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.622961 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hhlz\" (UniqueName: \"kubernetes.io/projected/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-kube-api-access-9hhlz\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.622976 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.622987 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.622999 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.666166 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7598d89cd4-qfmh9" event={"ID":"91206ea2-5d2b-478d-983e-6c842f02819b","Type":"ContainerStarted","Data":"f929d8442913bfaecc7956e8f7c394bf2287e01e6f101666b06a41edc759a582"} Mar 13 12:07:20 crc 
kubenswrapper[4837]: I0313 12:07:20.666209 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7598d89cd4-qfmh9" event={"ID":"91206ea2-5d2b-478d-983e-6c842f02819b","Type":"ContainerStarted","Data":"104e38d91432a24be429666c7aef47a48dc5e37624f7f42d829e3d5a83308ad5"} Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.666220 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7598d89cd4-qfmh9" event={"ID":"91206ea2-5d2b-478d-983e-6c842f02819b","Type":"ContainerStarted","Data":"e401c09fc39f0377fdb0e13cc3564c85b21b640ae75df7edda1290f89d0c1fda"} Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.667180 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.667238 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.671850 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6f484085-7b83-46a8-80c2-b3ef6f8b8798","Type":"ContainerStarted","Data":"2608f88642291363e7163567a42948a0027f5da1a879663defaa6a0c943729b9"} Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.675802 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"de6b1e01-3054-46d9-b2f3-a8f3a7e504af","Type":"ContainerStarted","Data":"036fe40da00c951a03639436f1d70f870b431fe6b0431a148132bcc8ff154aeb"} Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.694020 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-nn655" event={"ID":"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc","Type":"ContainerDied","Data":"0c767985dc8b487e12a79302a8fc49009a12a9085c34846f811ba339c130fda9"} Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.694083 4837 scope.go:117] "RemoveContainer" 
containerID="1ba1ecdd77454b316a59d76a9628041bd162e97b7a353cd219789d008b6bfecc" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.694239 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.700007 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7598d89cd4-qfmh9" podStartSLOduration=2.699983241 podStartE2EDuration="2.699983241s" podCreationTimestamp="2026-03-13 12:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:20.68469284 +0000 UTC m=+1156.322959623" watchObservedRunningTime="2026-03-13 12:07:20.699983241 +0000 UTC m=+1156.338250004" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.732895 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" event={"ID":"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b","Type":"ContainerStarted","Data":"18aeb282fbd8558fc7f2a4d93c502285e6ae25649a3f62cf2708ff5492d7993d"} Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.732937 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" event={"ID":"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b","Type":"ContainerStarted","Data":"3b4bbdde4e1a36119cc27a40f2a694902d8b5f53fa6c902b59c1385e734f5a5e"} Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.770374 4837 generic.go:334] "Generic (PLEG): container finished" podID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerID="9c2af154abb9a37a270c00c3cc335b4994ab6bb24ddaf80f1f5bfc313a6b9fb6" exitCode=0 Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.773740 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee","Type":"ContainerDied","Data":"9c2af154abb9a37a270c00c3cc335b4994ab6bb24ddaf80f1f5bfc313a6b9fb6"} Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.821065 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-nn655"] Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.835415 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-nn655"] Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.080964 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.095828 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc" path="/var/lib/kubelet/pods/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc/volumes" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.241480 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r7pn\" (UniqueName: \"kubernetes.io/projected/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-kube-api-access-6r7pn\") pod \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.241628 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-combined-ca-bundle\") pod \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.241704 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-sg-core-conf-yaml\") pod \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " Mar 13 
12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.241780 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-log-httpd\") pod \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.241799 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-run-httpd\") pod \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.241878 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-scripts\") pod \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.241912 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-config-data\") pod \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.243193 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" (UID: "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.243893 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" (UID: "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.251816 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-kube-api-access-6r7pn" (OuterVolumeSpecName: "kube-api-access-6r7pn") pod "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" (UID: "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee"). InnerVolumeSpecName "kube-api-access-6r7pn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.257822 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-scripts" (OuterVolumeSpecName: "scripts") pod "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" (UID: "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.287953 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" (UID: "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.334897 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" (UID: "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.344966 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r7pn\" (UniqueName: \"kubernetes.io/projected/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-kube-api-access-6r7pn\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.345044 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.345068 4837 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.345078 4837 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.345089 4837 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.345096 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.363187 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-config-data" (OuterVolumeSpecName: "config-data") pod "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" (UID: "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.449248 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.462843 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.464094 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.465825 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.791655 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.791662 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee","Type":"ContainerDied","Data":"2fe508c1e7b8efe966205eebb2665129d9e9d777f425ce11141b713c93504dc7"} Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.791809 4837 scope.go:117] "RemoveContainer" containerID="efca9fe013107e0157b7dfeec701a6bf70c8455183d2d8d806b63a4c79489237" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.798686 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6f484085-7b83-46a8-80c2-b3ef6f8b8798","Type":"ContainerStarted","Data":"bb986b6c527ca78bf1e0896829a89d5b0ab27431c49d56719acca1f95eca36b5"} Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.807072 4837 generic.go:334] "Generic (PLEG): container finished" podID="fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" containerID="18aeb282fbd8558fc7f2a4d93c502285e6ae25649a3f62cf2708ff5492d7993d" exitCode=0 Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.808292 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" event={"ID":"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b","Type":"ContainerDied","Data":"18aeb282fbd8558fc7f2a4d93c502285e6ae25649a3f62cf2708ff5492d7993d"} Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.931323 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.959759 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.968925 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:07:21 crc kubenswrapper[4837]: E0313 12:07:21.969916 4837 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerName="ceilometer-notification-agent" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.969938 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerName="ceilometer-notification-agent" Mar 13 12:07:21 crc kubenswrapper[4837]: E0313 12:07:21.969983 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc" containerName="init" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.969992 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc" containerName="init" Mar 13 12:07:21 crc kubenswrapper[4837]: E0313 12:07:21.970006 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerName="sg-core" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.970014 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerName="sg-core" Mar 13 12:07:21 crc kubenswrapper[4837]: E0313 12:07:21.970032 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerName="proxy-httpd" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.970040 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerName="proxy-httpd" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.970254 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerName="ceilometer-notification-agent" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.970277 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerName="sg-core" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.970295 4837 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerName="proxy-httpd" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.970314 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc" containerName="init" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.972840 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.977215 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.978867 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.979994 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.064650 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8944c2be-da67-4cdd-9f75-0e473253e932-log-httpd\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.065540 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-scripts\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.066084 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " 
pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.066258 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz8t7\" (UniqueName: \"kubernetes.io/projected/8944c2be-da67-4cdd-9f75-0e473253e932-kube-api-access-cz8t7\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.066296 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-config-data\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.066423 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.066552 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8944c2be-da67-4cdd-9f75-0e473253e932-run-httpd\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.168305 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-scripts\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.168487 4837 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.168535 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz8t7\" (UniqueName: \"kubernetes.io/projected/8944c2be-da67-4cdd-9f75-0e473253e932-kube-api-access-cz8t7\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.168566 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-config-data\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.168596 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.168618 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8944c2be-da67-4cdd-9f75-0e473253e932-run-httpd\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.168659 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8944c2be-da67-4cdd-9f75-0e473253e932-log-httpd\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " 
pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.169104 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8944c2be-da67-4cdd-9f75-0e473253e932-log-httpd\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.171224 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8944c2be-da67-4cdd-9f75-0e473253e932-run-httpd\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.176388 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-config-data\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.176844 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.179650 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-scripts\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.180683 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.194166 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz8t7\" (UniqueName: \"kubernetes.io/projected/8944c2be-da67-4cdd-9f75-0e473253e932-kube-api-access-cz8t7\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.337980 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.393904 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.596510 4837 scope.go:117] "RemoveContainer" containerID="367b20646b35759313f66e2deebf0c3f1def518ed4ba18cc4ba66cc774436167" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.757706 4837 scope.go:117] "RemoveContainer" containerID="9c2af154abb9a37a270c00c3cc335b4994ab6bb24ddaf80f1f5bfc313a6b9fb6" Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.073075 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" path="/var/lib/kubelet/pods/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee/volumes" Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.381988 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.845119 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8944c2be-da67-4cdd-9f75-0e473253e932","Type":"ContainerStarted","Data":"affa40a245268506c6f6766fb2f158d46986fd7f106dd4cfb003b265c6f1faa4"} Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.850059 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"de6b1e01-3054-46d9-b2f3-a8f3a7e504af","Type":"ContainerStarted","Data":"a728200e4d66707c01f4e20cc7de5a1c1266b885af01c5ea8dd37d28e1bdd6bc"} Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.852747 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" event={"ID":"55084c82-a823-4f31-926e-21702ba02ba1","Type":"ContainerStarted","Data":"44b1cd8a58b692c49a76e5f57bb432f41dc1c5c54cebee1805c8fa67be55ed5c"} Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.852798 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" event={"ID":"55084c82-a823-4f31-926e-21702ba02ba1","Type":"ContainerStarted","Data":"c54851290685ce709d0e6de4969cfb9edf3402a8ad802731045343b1d7b59d2a"} Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.855176 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" event={"ID":"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9","Type":"ContainerStarted","Data":"149e91272542ec915648f0494b9e7a35f69d4dd526e3c7ef28a873474b5326e9"} Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.855213 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" event={"ID":"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9","Type":"ContainerStarted","Data":"2da0bfaeffcd62165fc74c1d17cbbde1617d888708f62aa9c7c0915aba58a23a"} Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.858768 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" event={"ID":"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b","Type":"ContainerStarted","Data":"308f5a2ca30c72015ad1831a239549e973a6a698921b4916b0e838cdf0b49c8a"} Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.858884 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 
12:07:23.878212 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" podStartSLOduration=3.049866951 podStartE2EDuration="6.87819637s" podCreationTimestamp="2026-03-13 12:07:17 +0000 UTC" firstStartedPulling="2026-03-13 12:07:18.97576274 +0000 UTC m=+1154.614029503" lastFinishedPulling="2026-03-13 12:07:22.804092159 +0000 UTC m=+1158.442358922" observedRunningTime="2026-03-13 12:07:23.877081696 +0000 UTC m=+1159.515348469" watchObservedRunningTime="2026-03-13 12:07:23.87819637 +0000 UTC m=+1159.516463133" Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.883892 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6f484085-7b83-46a8-80c2-b3ef6f8b8798","Type":"ContainerStarted","Data":"4acde6c31477a18525c2bc313aa155955862f73aaa7708329d4edc29e752be5d"} Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.884063 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6f484085-7b83-46a8-80c2-b3ef6f8b8798" containerName="cinder-api-log" containerID="cri-o://bb986b6c527ca78bf1e0896829a89d5b0ab27431c49d56719acca1f95eca36b5" gracePeriod=30 Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.884207 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.884261 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6f484085-7b83-46a8-80c2-b3ef6f8b8798" containerName="cinder-api" containerID="cri-o://4acde6c31477a18525c2bc313aa155955862f73aaa7708329d4edc29e752be5d" gracePeriod=30 Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.906660 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" podStartSLOduration=3.382571377 podStartE2EDuration="6.906620944s" 
podCreationTimestamp="2026-03-13 12:07:17 +0000 UTC" firstStartedPulling="2026-03-13 12:07:19.234349516 +0000 UTC m=+1154.872616279" lastFinishedPulling="2026-03-13 12:07:22.758399083 +0000 UTC m=+1158.396665846" observedRunningTime="2026-03-13 12:07:23.904010802 +0000 UTC m=+1159.542277565" watchObservedRunningTime="2026-03-13 12:07:23.906620944 +0000 UTC m=+1159.544887717" Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.944194 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" podStartSLOduration=5.944175244 podStartE2EDuration="5.944175244s" podCreationTimestamp="2026-03-13 12:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:23.943360658 +0000 UTC m=+1159.581627411" watchObservedRunningTime="2026-03-13 12:07:23.944175244 +0000 UTC m=+1159.582442007" Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.988161 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.9881429950000005 podStartE2EDuration="5.988142995s" podCreationTimestamp="2026-03-13 12:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:23.975064985 +0000 UTC m=+1159.613331748" watchObservedRunningTime="2026-03-13 12:07:23.988142995 +0000 UTC m=+1159.626409758" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.713632 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6d84f6b8c8-8rrwq"] Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.716224 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.721537 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.721854 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.754918 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6d84f6b8c8-8rrwq"] Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.872569 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74c7e377-b579-47bc-a992-cca0cf047627-logs\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.872617 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-config-data-custom\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.872653 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d459r\" (UniqueName: \"kubernetes.io/projected/74c7e377-b579-47bc-a992-cca0cf047627-kube-api-access-d459r\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.872819 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-combined-ca-bundle\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.872990 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-public-tls-certs\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.873053 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-internal-tls-certs\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.873213 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-config-data\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.904920 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"de6b1e01-3054-46d9-b2f3-a8f3a7e504af","Type":"ContainerStarted","Data":"997441b3a97d2775eceaed89ffadcc848e11b76800ad4277b31bca177c72edfb"} Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.909186 4837 generic.go:334] "Generic (PLEG): container finished" podID="6f484085-7b83-46a8-80c2-b3ef6f8b8798" 
containerID="bb986b6c527ca78bf1e0896829a89d5b0ab27431c49d56719acca1f95eca36b5" exitCode=143 Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.909264 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6f484085-7b83-46a8-80c2-b3ef6f8b8798","Type":"ContainerDied","Data":"bb986b6c527ca78bf1e0896829a89d5b0ab27431c49d56719acca1f95eca36b5"} Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.913610 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8944c2be-da67-4cdd-9f75-0e473253e932","Type":"ContainerStarted","Data":"f106579d1eb92efbafe377b2c5e41ffb980fcd44573e4b8ba73109499680b552"} Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.926726 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.060688637 podStartE2EDuration="6.926708648s" podCreationTimestamp="2026-03-13 12:07:18 +0000 UTC" firstStartedPulling="2026-03-13 12:07:19.824601443 +0000 UTC m=+1155.462868206" lastFinishedPulling="2026-03-13 12:07:22.690621454 +0000 UTC m=+1158.328888217" observedRunningTime="2026-03-13 12:07:24.920343387 +0000 UTC m=+1160.558610150" watchObservedRunningTime="2026-03-13 12:07:24.926708648 +0000 UTC m=+1160.564975411" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.975246 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-internal-tls-certs\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.975374 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-config-data\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: 
\"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.975432 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74c7e377-b579-47bc-a992-cca0cf047627-logs\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.975460 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-config-data-custom\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.975479 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d459r\" (UniqueName: \"kubernetes.io/projected/74c7e377-b579-47bc-a992-cca0cf047627-kube-api-access-d459r\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.975526 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-combined-ca-bundle\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.975579 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-public-tls-certs\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: 
\"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.976732 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74c7e377-b579-47bc-a992-cca0cf047627-logs\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.982197 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-public-tls-certs\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.985900 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-config-data\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.987964 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-internal-tls-certs\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.997662 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-config-data-custom\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 
12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.998378 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-combined-ca-bundle\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.999313 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d459r\" (UniqueName: \"kubernetes.io/projected/74c7e377-b579-47bc-a992-cca0cf047627-kube-api-access-d459r\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:25 crc kubenswrapper[4837]: I0313 12:07:25.063156 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:25 crc kubenswrapper[4837]: W0313 12:07:25.318898 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e1cc75f_7386_440f_ba9f_9c3fd7b7d4ee.slice/crio-367b20646b35759313f66e2deebf0c3f1def518ed4ba18cc4ba66cc774436167.scope WatchSource:0}: Error finding container 367b20646b35759313f66e2deebf0c3f1def518ed4ba18cc4ba66cc774436167: Status 404 returned error can't find the container with id 367b20646b35759313f66e2deebf0c3f1def518ed4ba18cc4ba66cc774436167 Mar 13 12:07:25 crc kubenswrapper[4837]: W0313 12:07:25.329127 4837 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda44db1d6_6da2_41a5_a37f_ffc602f0d55a.slice/crio-conmon-843cf40344096a3f0565478be09bc819697f7ebe87515db62c711cd361ef6ce2.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda44db1d6_6da2_41a5_a37f_ffc602f0d55a.slice/crio-conmon-843cf40344096a3f0565478be09bc819697f7ebe87515db62c711cd361ef6ce2.scope: no such file or directory Mar 13 12:07:25 crc kubenswrapper[4837]: W0313 12:07:25.329182 4837 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda44db1d6_6da2_41a5_a37f_ffc602f0d55a.slice/crio-843cf40344096a3f0565478be09bc819697f7ebe87515db62c711cd361ef6ce2.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda44db1d6_6da2_41a5_a37f_ffc602f0d55a.slice/crio-843cf40344096a3f0565478be09bc819697f7ebe87515db62c711cd361ef6ce2.scope: no such file or directory Mar 13 12:07:25 crc kubenswrapper[4837]: W0313 12:07:25.353942 4837 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e1cc75f_7386_440f_ba9f_9c3fd7b7d4ee.slice/crio-conmon-efca9fe013107e0157b7dfeec701a6bf70c8455183d2d8d806b63a4c79489237.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e1cc75f_7386_440f_ba9f_9c3fd7b7d4ee.slice/crio-conmon-efca9fe013107e0157b7dfeec701a6bf70c8455183d2d8d806b63a4c79489237.scope: no such file or directory Mar 13 12:07:25 crc kubenswrapper[4837]: W0313 12:07:25.353988 4837 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e1cc75f_7386_440f_ba9f_9c3fd7b7d4ee.slice/crio-efca9fe013107e0157b7dfeec701a6bf70c8455183d2d8d806b63a4c79489237.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e1cc75f_7386_440f_ba9f_9c3fd7b7d4ee.slice/crio-efca9fe013107e0157b7dfeec701a6bf70c8455183d2d8d806b63a4c79489237.scope: no 
such file or directory Mar 13 12:07:25 crc kubenswrapper[4837]: W0313 12:07:25.386466 4837 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8105a7ee_7d4e_471a_a39d_b3b9b75c3dcc.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8105a7ee_7d4e_471a_a39d_b3b9b75c3dcc.slice: no such file or directory Mar 13 12:07:25 crc kubenswrapper[4837]: E0313 12:07:25.647653 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08c7b2a5_b0b8_433f_b55d_c64eaeea8b76.slice/crio-117a085c3636a60886a4974e5b0fb9b17907bfbb02c0f28e14e88a6a4aada355.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95b808e7_674f_4592_af6e_f7c8682f6a17.slice/crio-ba3dda01a90b7b0d00508491184e90f099c7ae7bc849213376ebbc68b88ffd0f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08c7b2a5_b0b8_433f_b55d_c64eaeea8b76.slice/crio-conmon-117a085c3636a60886a4974e5b0fb9b17907bfbb02c0f28e14e88a6a4aada355.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08c7b2a5_b0b8_433f_b55d_c64eaeea8b76.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95b808e7_674f_4592_af6e_f7c8682f6a17.slice/crio-conmon-ba3dda01a90b7b0d00508491184e90f099c7ae7bc849213376ebbc68b88ffd0f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda44db1d6_6da2_41a5_a37f_ffc602f0d55a.slice/crio-d87408c4f80f070da48980a1c0c42ec26d6e0f566d37471876ae97d32157796e\": RecentStats: unable to find data in 
memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95b808e7_674f_4592_af6e_f7c8682f6a17.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08c7b2a5_b0b8_433f_b55d_c64eaeea8b76.slice/crio-d8d4fa30fd1f227e47a679c4ebd48ddee761f9902a8c45ed343c205dc3f7e3b1\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95b808e7_674f_4592_af6e_f7c8682f6a17.slice/crio-02cdc5326e2dbc385d4e7090105a3655b6651929ef4db12950f0c379aaf98274\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e1cc75f_7386_440f_ba9f_9c3fd7b7d4ee.slice/crio-conmon-367b20646b35759313f66e2deebf0c3f1def518ed4ba18cc4ba66cc774436167.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda44db1d6_6da2_41a5_a37f_ffc602f0d55a.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:07:25 crc kubenswrapper[4837]: I0313 12:07:25.657429 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:07:25 crc kubenswrapper[4837]: I0313 12:07:25.684736 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6d84f6b8c8-8rrwq"] Mar 13 12:07:25 crc kubenswrapper[4837]: I0313 12:07:25.885341 4837 scope.go:117] "RemoveContainer" containerID="1a04d5901dd1375cafd0fc584ce462f13000b8c9b02a1c2603aedb866420cd51" Mar 13 12:07:25 crc kubenswrapper[4837]: I0313 12:07:25.971421 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c5479d889-t9mnp"] Mar 13 12:07:25 crc kubenswrapper[4837]: I0313 12:07:25.971716 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-c5479d889-t9mnp" podUID="0faefca0-6038-4bdf-856e-b7cb5b6c5536" containerName="neutron-api" 
containerID="cri-o://9a02a987a1d45aed6ebc32b498a9af8ccb4aa210832c48787a22a25a2228e529" gracePeriod=30 Mar 13 12:07:25 crc kubenswrapper[4837]: I0313 12:07:25.972470 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-c5479d889-t9mnp" podUID="0faefca0-6038-4bdf-856e-b7cb5b6c5536" containerName="neutron-httpd" containerID="cri-o://541466fc166402b9bfa4140bd97e50553b49c072d454b4f07847687fa559214e" gracePeriod=30 Mar 13 12:07:25 crc kubenswrapper[4837]: I0313 12:07:25.992018 4837 generic.go:334] "Generic (PLEG): container finished" podID="5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" containerID="7f77bd5a27791856de608c8a08f3c83e1663f61407b889e9328671983bac96ca" exitCode=137 Mar 13 12:07:25 crc kubenswrapper[4837]: I0313 12:07:25.992051 4837 generic.go:334] "Generic (PLEG): container finished" podID="5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" containerID="b196a9394882e394baaf7222251dcba129911dfdfe911b4d1d679d89adbed206" exitCode=137 Mar 13 12:07:25 crc kubenswrapper[4837]: I0313 12:07:25.992138 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c6787dc45-zbdfx" event={"ID":"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14","Type":"ContainerDied","Data":"7f77bd5a27791856de608c8a08f3c83e1663f61407b889e9328671983bac96ca"} Mar 13 12:07:25 crc kubenswrapper[4837]: I0313 12:07:25.992170 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c6787dc45-zbdfx" event={"ID":"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14","Type":"ContainerDied","Data":"b196a9394882e394baaf7222251dcba129911dfdfe911b4d1d679d89adbed206"} Mar 13 12:07:25 crc kubenswrapper[4837]: I0313 12:07:25.997098 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-667d547b9-4p8qm"] Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:25.999189 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.005867 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d84f6b8c8-8rrwq" event={"ID":"74c7e377-b579-47bc-a992-cca0cf047627","Type":"ContainerStarted","Data":"cf495db3422cf0d35cd836e716531a2384380d9bbe3980dd393e4bcfb3fe6343"} Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.036749 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-667d547b9-4p8qm"] Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.039766 4837 generic.go:334] "Generic (PLEG): container finished" podID="1f2afb5c-bfb2-4349-8000-4c0c90892d56" containerID="ccf4fdc9606b0ae8a6ecc82badd31da8c6fddc1f4294bee13d5805f8da627b43" exitCode=137 Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.039885 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b5f9b5c85-p584g" event={"ID":"1f2afb5c-bfb2-4349-8000-4c0c90892d56","Type":"ContainerDied","Data":"ccf4fdc9606b0ae8a6ecc82badd31da8c6fddc1f4294bee13d5805f8da627b43"} Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.039918 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b5f9b5c85-p584g" event={"ID":"1f2afb5c-bfb2-4349-8000-4c0c90892d56","Type":"ContainerDied","Data":"18955fe5d50dd684cdf6370fe66929e8470b6ef70461302b53ed56fa3595256c"} Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.039932 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18955fe5d50dd684cdf6370fe66929e8470b6ef70461302b53ed56fa3595256c" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.074513 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.081003 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8944c2be-da67-4cdd-9f75-0e473253e932","Type":"ContainerStarted","Data":"343420b862af8b30fbf01c83c65d52d9d2faba010cdc819f2b823e8a9b058006"} Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.099042 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.109094 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-c5479d889-t9mnp" podUID="0faefca0-6038-4bdf-856e-b7cb5b6c5536" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.158:9696/\": read tcp 10.217.0.2:41566->10.217.0.158:9696: read: connection reset by peer" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.131717 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scc7s\" (UniqueName: \"kubernetes.io/projected/1f2afb5c-bfb2-4349-8000-4c0c90892d56-kube-api-access-scc7s\") pod \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.131794 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f2afb5c-bfb2-4349-8000-4c0c90892d56-scripts\") pod \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.131853 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f2afb5c-bfb2-4349-8000-4c0c90892d56-logs\") pod \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " Mar 13 12:07:26 crc 
kubenswrapper[4837]: I0313 12:07:26.132007 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1f2afb5c-bfb2-4349-8000-4c0c90892d56-horizon-secret-key\") pod \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.132041 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1f2afb5c-bfb2-4349-8000-4c0c90892d56-config-data\") pod \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.132355 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-combined-ca-bundle\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.132446 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-internal-tls-certs\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.132514 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-public-tls-certs\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.132544 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-ovndb-tls-certs\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.132595 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-config\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.132619 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wngx\" (UniqueName: \"kubernetes.io/projected/3c00dfc0-061b-43ba-b529-a89c9157a0cf-kube-api-access-9wngx\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.132662 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-httpd-config\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.146274 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f2afb5c-bfb2-4349-8000-4c0c90892d56-logs" (OuterVolumeSpecName: "logs") pod "1f2afb5c-bfb2-4349-8000-4c0c90892d56" (UID: "1f2afb5c-bfb2-4349-8000-4c0c90892d56"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.148888 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f2afb5c-bfb2-4349-8000-4c0c90892d56-kube-api-access-scc7s" (OuterVolumeSpecName: "kube-api-access-scc7s") pod "1f2afb5c-bfb2-4349-8000-4c0c90892d56" (UID: "1f2afb5c-bfb2-4349-8000-4c0c90892d56"). InnerVolumeSpecName "kube-api-access-scc7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.153895 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f2afb5c-bfb2-4349-8000-4c0c90892d56-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1f2afb5c-bfb2-4349-8000-4c0c90892d56" (UID: "1f2afb5c-bfb2-4349-8000-4c0c90892d56"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.212113 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f2afb5c-bfb2-4349-8000-4c0c90892d56-config-data" (OuterVolumeSpecName: "config-data") pod "1f2afb5c-bfb2-4349-8000-4c0c90892d56" (UID: "1f2afb5c-bfb2-4349-8000-4c0c90892d56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.231787 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f2afb5c-bfb2-4349-8000-4c0c90892d56-scripts" (OuterVolumeSpecName: "scripts") pod "1f2afb5c-bfb2-4349-8000-4c0c90892d56" (UID: "1f2afb5c-bfb2-4349-8000-4c0c90892d56"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.233811 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-horizon-secret-key\") pod \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.233940 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-config-data\") pod \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.233993 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-scripts\") pod \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.234100 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v796v\" (UniqueName: \"kubernetes.io/projected/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-kube-api-access-v796v\") pod \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.234210 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-logs\") pod \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.234451 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-combined-ca-bundle\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.234582 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-internal-tls-certs\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.234764 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-public-tls-certs\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.234860 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-ovndb-tls-certs\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.234934 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-config\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.234955 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wngx\" (UniqueName: \"kubernetes.io/projected/3c00dfc0-061b-43ba-b529-a89c9157a0cf-kube-api-access-9wngx\") pod 
\"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.234991 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-httpd-config\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.235146 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scc7s\" (UniqueName: \"kubernetes.io/projected/1f2afb5c-bfb2-4349-8000-4c0c90892d56-kube-api-access-scc7s\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.235161 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f2afb5c-bfb2-4349-8000-4c0c90892d56-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.235173 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f2afb5c-bfb2-4349-8000-4c0c90892d56-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.235184 4837 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1f2afb5c-bfb2-4349-8000-4c0c90892d56-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.235195 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1f2afb5c-bfb2-4349-8000-4c0c90892d56-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.236569 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-logs" (OuterVolumeSpecName: "logs") pod "5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" (UID: "5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.242258 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-config\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.266662 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" (UID: "5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.271890 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-kube-api-access-v796v" (OuterVolumeSpecName: "kube-api-access-v796v") pod "5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" (UID: "5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14"). InnerVolumeSpecName "kube-api-access-v796v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.275226 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-httpd-config\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.275370 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-internal-tls-certs\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.275426 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-combined-ca-bundle\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.287847 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-public-tls-certs\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.293399 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-ovndb-tls-certs\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.294380 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wngx\" (UniqueName: \"kubernetes.io/projected/3c00dfc0-061b-43ba-b529-a89c9157a0cf-kube-api-access-9wngx\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.320993 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-config-data" (OuterVolumeSpecName: "config-data") pod "5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" (UID: "5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.323807 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-scripts" (OuterVolumeSpecName: "scripts") pod "5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" (UID: "5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.337245 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.337317 4837 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.337334 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.337350 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.337364 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v796v\" (UniqueName: \"kubernetes.io/projected/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-kube-api-access-v796v\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.388538 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.027014 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-667d547b9-4p8qm"] Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.097256 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8944c2be-da67-4cdd-9f75-0e473253e932","Type":"ContainerStarted","Data":"7d867af8873a7e0421fd52164c9458573cf6ac8847b38f845cf622f104ceb41b"} Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.107995 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c6787dc45-zbdfx" event={"ID":"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14","Type":"ContainerDied","Data":"c8e41db64721802eb9e2d30e33b7feaf3f233822df5127e44d2dee0b5f64ca8a"} Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.108231 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.108263 4837 scope.go:117] "RemoveContainer" containerID="7f77bd5a27791856de608c8a08f3c83e1663f61407b889e9328671983bac96ca" Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.113036 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d84f6b8c8-8rrwq" event={"ID":"74c7e377-b579-47bc-a992-cca0cf047627","Type":"ContainerStarted","Data":"bda4417fc8cd38d3600910af72f405f811adc75b43760e4deafc89dbb5440630"} Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.113082 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d84f6b8c8-8rrwq" event={"ID":"74c7e377-b579-47bc-a992-cca0cf047627","Type":"ContainerStarted","Data":"f764b12423b04c656194dce53df7362f4f08a5d964fb08ede56b50cd03df2f6b"} Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.113295 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.113347 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.124122 4837 generic.go:334] "Generic (PLEG): container finished" podID="0faefca0-6038-4bdf-856e-b7cb5b6c5536" containerID="541466fc166402b9bfa4140bd97e50553b49c072d454b4f07847687fa559214e" exitCode=0 Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.124219 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c5479d889-t9mnp" event={"ID":"0faefca0-6038-4bdf-856e-b7cb5b6c5536","Type":"ContainerDied","Data":"541466fc166402b9bfa4140bd97e50553b49c072d454b4f07847687fa559214e"} Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.129073 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.131295 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-667d547b9-4p8qm" event={"ID":"3c00dfc0-061b-43ba-b529-a89c9157a0cf","Type":"ContainerStarted","Data":"752e078a847a78d0c6486ef91987f5453c24a2b19503c1b1179258d77a7a485b"} Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.145422 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6d84f6b8c8-8rrwq" podStartSLOduration=3.145405275 podStartE2EDuration="3.145405275s" podCreationTimestamp="2026-03-13 12:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:27.136055952 +0000 UTC m=+1162.774322715" watchObservedRunningTime="2026-03-13 12:07:27.145405275 +0000 UTC m=+1162.783672038" Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.283238 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/horizon-6b5f9b5c85-p584g"] Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.298137 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6b5f9b5c85-p584g"] Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.306709 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-c6787dc45-zbdfx"] Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.313940 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-c6787dc45-zbdfx"] Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.427842 4837 scope.go:117] "RemoveContainer" containerID="b196a9394882e394baaf7222251dcba129911dfdfe911b4d1d679d89adbed206" Mar 13 12:07:28 crc kubenswrapper[4837]: I0313 12:07:28.151868 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-667d547b9-4p8qm" event={"ID":"3c00dfc0-061b-43ba-b529-a89c9157a0cf","Type":"ContainerStarted","Data":"a0dcaabad8d1b5fb470055b19ec292f56bcf50b4090047b7711c1d3f3cea96e6"} Mar 13 12:07:28 crc kubenswrapper[4837]: I0313 12:07:28.152495 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-667d547b9-4p8qm" event={"ID":"3c00dfc0-061b-43ba-b529-a89c9157a0cf","Type":"ContainerStarted","Data":"1779fe880258500d895a7213d8cb917cc09aed870df6ff19a3eb464702b779bb"} Mar 13 12:07:28 crc kubenswrapper[4837]: I0313 12:07:28.182492 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-667d547b9-4p8qm" podStartSLOduration=3.182475064 podStartE2EDuration="3.182475064s" podCreationTimestamp="2026-03-13 12:07:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:28.175452252 +0000 UTC m=+1163.813719015" watchObservedRunningTime="2026-03-13 12:07:28.182475064 +0000 UTC m=+1163.820741827" Mar 13 12:07:28 crc kubenswrapper[4837]: E0313 12:07:28.377870 4837 fsHandler.go:119] failed to collect 
filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/9a25fa6b22bb0edd887bfaf8ac40afc80451636491068fad146b7092f8766aa1/diff" to get inode usage: stat /var/lib/containers/storage/overlay/9a25fa6b22bb0edd887bfaf8ac40afc80451636491068fad146b7092f8766aa1/diff: no such file or directory, extraDiskErr: Mar 13 12:07:28 crc kubenswrapper[4837]: I0313 12:07:28.939436 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.062400 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f2afb5c-bfb2-4349-8000-4c0c90892d56" path="/var/lib/kubelet/pods/1f2afb5c-bfb2-4349-8000-4c0c90892d56/volumes" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.063551 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" path="/var/lib/kubelet/pods/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14/volumes" Mar 13 12:07:29 crc kubenswrapper[4837]: E0313 12:07:29.083083 4837 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.187914 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8944c2be-da67-4cdd-9f75-0e473253e932","Type":"ContainerStarted","Data":"317a9b585064c8564bcd2ae43ba40834f6b1cd25a3d83d32f956502b5b280276"} Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.188786 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.211060 4837 generic.go:334] "Generic (PLEG): container finished" podID="0faefca0-6038-4bdf-856e-b7cb5b6c5536" containerID="9a02a987a1d45aed6ebc32b498a9af8ccb4aa210832c48787a22a25a2228e529" exitCode=0 Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 
12:07:29.212363 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c5479d889-t9mnp" event={"ID":"0faefca0-6038-4bdf-856e-b7cb5b6c5536","Type":"ContainerDied","Data":"9a02a987a1d45aed6ebc32b498a9af8ccb4aa210832c48787a22a25a2228e529"} Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.212417 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.267751 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.307582 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.705845236 podStartE2EDuration="8.307565097s" podCreationTimestamp="2026-03-13 12:07:21 +0000 UTC" firstStartedPulling="2026-03-13 12:07:23.444414269 +0000 UTC m=+1159.082681032" lastFinishedPulling="2026-03-13 12:07:28.04613412 +0000 UTC m=+1163.684400893" observedRunningTime="2026-03-13 12:07:29.219118158 +0000 UTC m=+1164.857384921" watchObservedRunningTime="2026-03-13 12:07:29.307565097 +0000 UTC m=+1164.945831860" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.337988 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.355774 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.357800 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.410173 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.411369 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-httpd-config\") pod \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.411426 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-ovndb-tls-certs\") pod \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.411519 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-config\") pod \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.411580 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vh2b\" (UniqueName: \"kubernetes.io/projected/0faefca0-6038-4bdf-856e-b7cb5b6c5536-kube-api-access-4vh2b\") pod \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.411856 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-combined-ca-bundle\") pod \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.411950 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-public-tls-certs\") pod \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.411983 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-internal-tls-certs\") pod \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.502174 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0faefca0-6038-4bdf-856e-b7cb5b6c5536" (UID: "0faefca0-6038-4bdf-856e-b7cb5b6c5536"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.517053 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0faefca0-6038-4bdf-856e-b7cb5b6c5536-kube-api-access-4vh2b" (OuterVolumeSpecName: "kube-api-access-4vh2b") pod "0faefca0-6038-4bdf-856e-b7cb5b6c5536" (UID: "0faefca0-6038-4bdf-856e-b7cb5b6c5536"). InnerVolumeSpecName "kube-api-access-4vh2b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.523937 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vh2b\" (UniqueName: \"kubernetes.io/projected/0faefca0-6038-4bdf-856e-b7cb5b6c5536-kube-api-access-4vh2b\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.523996 4837 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.558902 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0faefca0-6038-4bdf-856e-b7cb5b6c5536" (UID: "0faefca0-6038-4bdf-856e-b7cb5b6c5536"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.573169 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0faefca0-6038-4bdf-856e-b7cb5b6c5536" (UID: "0faefca0-6038-4bdf-856e-b7cb5b6c5536"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.588725 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t9gtj"] Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.589120 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" podUID="e06de12b-6071-4dce-81f1-68539347ca19" containerName="dnsmasq-dns" containerID="cri-o://3ac59c1680a2ceb3bddaf98537b9cf745919c45cdb4c77563d14fc2bd8920764" gracePeriod=10 Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.589769 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0faefca0-6038-4bdf-856e-b7cb5b6c5536" (UID: "0faefca0-6038-4bdf-856e-b7cb5b6c5536"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.614985 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-config" (OuterVolumeSpecName: "config") pod "0faefca0-6038-4bdf-856e-b7cb5b6c5536" (UID: "0faefca0-6038-4bdf-856e-b7cb5b6c5536"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.627295 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.627344 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.627357 4837 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.627369 4837 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.634020 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.661936 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0faefca0-6038-4bdf-856e-b7cb5b6c5536" (UID: "0faefca0-6038-4bdf-856e-b7cb5b6c5536"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.730823 4837 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.174141 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.228863 4837 generic.go:334] "Generic (PLEG): container finished" podID="e06de12b-6071-4dce-81f1-68539347ca19" containerID="3ac59c1680a2ceb3bddaf98537b9cf745919c45cdb4c77563d14fc2bd8920764" exitCode=0 Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.228924 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" event={"ID":"e06de12b-6071-4dce-81f1-68539347ca19","Type":"ContainerDied","Data":"3ac59c1680a2ceb3bddaf98537b9cf745919c45cdb4c77563d14fc2bd8920764"} Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.228952 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" event={"ID":"e06de12b-6071-4dce-81f1-68539347ca19","Type":"ContainerDied","Data":"bd08737ad8dd4994dc887a5676bf16b9103265c49a66d4c535944bbd694008c2"} Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.228967 4837 scope.go:117] "RemoveContainer" containerID="3ac59c1680a2ceb3bddaf98537b9cf745919c45cdb4c77563d14fc2bd8920764" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.229071 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.239979 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.240765 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c5479d889-t9mnp" event={"ID":"0faefca0-6038-4bdf-856e-b7cb5b6c5536","Type":"ContainerDied","Data":"ee2f2c6cd7031c0c388b4947ca3445235863139c835bb92b8b4570fbe2c76095"} Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.240990 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="de6b1e01-3054-46d9-b2f3-a8f3a7e504af" containerName="cinder-scheduler" containerID="cri-o://a728200e4d66707c01f4e20cc7de5a1c1266b885af01c5ea8dd37d28e1bdd6bc" gracePeriod=30 Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.241177 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="de6b1e01-3054-46d9-b2f3-a8f3a7e504af" containerName="probe" containerID="cri-o://997441b3a97d2775eceaed89ffadcc848e11b76800ad4277b31bca177c72edfb" gracePeriod=30 Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.241970 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-dns-swift-storage-0\") pod \"e06de12b-6071-4dce-81f1-68539347ca19\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.242085 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-ovsdbserver-sb\") pod \"e06de12b-6071-4dce-81f1-68539347ca19\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.242124 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-dns-svc\") pod \"e06de12b-6071-4dce-81f1-68539347ca19\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.242255 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r558r\" (UniqueName: \"kubernetes.io/projected/e06de12b-6071-4dce-81f1-68539347ca19-kube-api-access-r558r\") pod \"e06de12b-6071-4dce-81f1-68539347ca19\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.242298 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-config\") pod \"e06de12b-6071-4dce-81f1-68539347ca19\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.242332 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-ovsdbserver-nb\") pod \"e06de12b-6071-4dce-81f1-68539347ca19\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.263867 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e06de12b-6071-4dce-81f1-68539347ca19-kube-api-access-r558r" (OuterVolumeSpecName: "kube-api-access-r558r") pod "e06de12b-6071-4dce-81f1-68539347ca19" (UID: "e06de12b-6071-4dce-81f1-68539347ca19"). InnerVolumeSpecName "kube-api-access-r558r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.317378 4837 scope.go:117] "RemoveContainer" containerID="059e96d08021e09252961e7fbfcfd1f264e2e10514bec4760c10d5076b6990a3" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.336303 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-config" (OuterVolumeSpecName: "config") pod "e06de12b-6071-4dce-81f1-68539347ca19" (UID: "e06de12b-6071-4dce-81f1-68539347ca19"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.350363 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r558r\" (UniqueName: \"kubernetes.io/projected/e06de12b-6071-4dce-81f1-68539347ca19-kube-api-access-r558r\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.350403 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.381825 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e06de12b-6071-4dce-81f1-68539347ca19" (UID: "e06de12b-6071-4dce-81f1-68539347ca19"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.405144 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e06de12b-6071-4dce-81f1-68539347ca19" (UID: "e06de12b-6071-4dce-81f1-68539347ca19"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.416725 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c5479d889-t9mnp"] Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.420138 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e06de12b-6071-4dce-81f1-68539347ca19" (UID: "e06de12b-6071-4dce-81f1-68539347ca19"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.431957 4837 scope.go:117] "RemoveContainer" containerID="3ac59c1680a2ceb3bddaf98537b9cf745919c45cdb4c77563d14fc2bd8920764" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.432221 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e06de12b-6071-4dce-81f1-68539347ca19" (UID: "e06de12b-6071-4dce-81f1-68539347ca19"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:30 crc kubenswrapper[4837]: E0313 12:07:30.435742 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ac59c1680a2ceb3bddaf98537b9cf745919c45cdb4c77563d14fc2bd8920764\": container with ID starting with 3ac59c1680a2ceb3bddaf98537b9cf745919c45cdb4c77563d14fc2bd8920764 not found: ID does not exist" containerID="3ac59c1680a2ceb3bddaf98537b9cf745919c45cdb4c77563d14fc2bd8920764" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.435796 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ac59c1680a2ceb3bddaf98537b9cf745919c45cdb4c77563d14fc2bd8920764"} err="failed to get container status \"3ac59c1680a2ceb3bddaf98537b9cf745919c45cdb4c77563d14fc2bd8920764\": rpc error: code = NotFound desc = could not find container \"3ac59c1680a2ceb3bddaf98537b9cf745919c45cdb4c77563d14fc2bd8920764\": container with ID starting with 3ac59c1680a2ceb3bddaf98537b9cf745919c45cdb4c77563d14fc2bd8920764 not found: ID does not exist" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.435834 4837 scope.go:117] "RemoveContainer" containerID="059e96d08021e09252961e7fbfcfd1f264e2e10514bec4760c10d5076b6990a3" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.438001 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c5479d889-t9mnp"] Mar 13 12:07:30 crc kubenswrapper[4837]: E0313 12:07:30.439764 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"059e96d08021e09252961e7fbfcfd1f264e2e10514bec4760c10d5076b6990a3\": container with ID starting with 059e96d08021e09252961e7fbfcfd1f264e2e10514bec4760c10d5076b6990a3 not found: ID does not exist" containerID="059e96d08021e09252961e7fbfcfd1f264e2e10514bec4760c10d5076b6990a3" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.439803 4837 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"059e96d08021e09252961e7fbfcfd1f264e2e10514bec4760c10d5076b6990a3"} err="failed to get container status \"059e96d08021e09252961e7fbfcfd1f264e2e10514bec4760c10d5076b6990a3\": rpc error: code = NotFound desc = could not find container \"059e96d08021e09252961e7fbfcfd1f264e2e10514bec4760c10d5076b6990a3\": container with ID starting with 059e96d08021e09252961e7fbfcfd1f264e2e10514bec4760c10d5076b6990a3 not found: ID does not exist" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.439824 4837 scope.go:117] "RemoveContainer" containerID="541466fc166402b9bfa4140bd97e50553b49c072d454b4f07847687fa559214e" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.455133 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.455204 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.455218 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.455238 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.464841 4837 scope.go:117] "RemoveContainer" containerID="9a02a987a1d45aed6ebc32b498a9af8ccb4aa210832c48787a22a25a2228e529" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 
12:07:30.560129 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t9gtj"] Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.572622 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t9gtj"] Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.723026 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:31 crc kubenswrapper[4837]: I0313 12:07:31.062370 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0faefca0-6038-4bdf-856e-b7cb5b6c5536" path="/var/lib/kubelet/pods/0faefca0-6038-4bdf-856e-b7cb5b6c5536/volumes" Mar 13 12:07:31 crc kubenswrapper[4837]: I0313 12:07:31.063384 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e06de12b-6071-4dce-81f1-68539347ca19" path="/var/lib/kubelet/pods/e06de12b-6071-4dce-81f1-68539347ca19/volumes" Mar 13 12:07:31 crc kubenswrapper[4837]: I0313 12:07:31.293570 4837 generic.go:334] "Generic (PLEG): container finished" podID="de6b1e01-3054-46d9-b2f3-a8f3a7e504af" containerID="997441b3a97d2775eceaed89ffadcc848e11b76800ad4277b31bca177c72edfb" exitCode=0 Mar 13 12:07:31 crc kubenswrapper[4837]: I0313 12:07:31.293631 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"de6b1e01-3054-46d9-b2f3-a8f3a7e504af","Type":"ContainerDied","Data":"997441b3a97d2775eceaed89ffadcc848e11b76800ad4277b31bca177c72edfb"} Mar 13 12:07:31 crc kubenswrapper[4837]: I0313 12:07:31.364243 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:31 crc kubenswrapper[4837]: I0313 12:07:31.705796 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:07:32 crc kubenswrapper[4837]: I0313 12:07:32.124214 4837 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:07:32 crc kubenswrapper[4837]: I0313 12:07:32.209762 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5596f9dfb8-m9bxb"] Mar 13 12:07:32 crc kubenswrapper[4837]: I0313 12:07:32.314578 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5596f9dfb8-m9bxb" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerName="horizon-log" containerID="cri-o://92b3db8efc4bd781409e05974c86a887259d700facd2c2ab05a9fcc6613ce654" gracePeriod=30 Mar 13 12:07:32 crc kubenswrapper[4837]: I0313 12:07:32.314994 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5596f9dfb8-m9bxb" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerName="horizon" containerID="cri-o://7e464f7436823332f050e26237bc563d04c928c21ee9b8d3087ae1cc9a85aacb" gracePeriod=30 Mar 13 12:07:32 crc kubenswrapper[4837]: I0313 12:07:32.880987 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.000106 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.135470 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwghc\" (UniqueName: \"kubernetes.io/projected/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-kube-api-access-qwghc\") pod \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.136143 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-etc-machine-id\") pod \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.136339 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-combined-ca-bundle\") pod \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.136394 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-scripts\") pod \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.136395 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "de6b1e01-3054-46d9-b2f3-a8f3a7e504af" (UID: "de6b1e01-3054-46d9-b2f3-a8f3a7e504af"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.136447 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-config-data\") pod \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.136542 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-config-data-custom\") pod \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.137546 4837 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.145982 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-kube-api-access-qwghc" (OuterVolumeSpecName: "kube-api-access-qwghc") pod "de6b1e01-3054-46d9-b2f3-a8f3a7e504af" (UID: "de6b1e01-3054-46d9-b2f3-a8f3a7e504af"). InnerVolumeSpecName "kube-api-access-qwghc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.150792 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "de6b1e01-3054-46d9-b2f3-a8f3a7e504af" (UID: "de6b1e01-3054-46d9-b2f3-a8f3a7e504af"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.157781 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-scripts" (OuterVolumeSpecName: "scripts") pod "de6b1e01-3054-46d9-b2f3-a8f3a7e504af" (UID: "de6b1e01-3054-46d9-b2f3-a8f3a7e504af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.219410 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de6b1e01-3054-46d9-b2f3-a8f3a7e504af" (UID: "de6b1e01-3054-46d9-b2f3-a8f3a7e504af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.238937 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwghc\" (UniqueName: \"kubernetes.io/projected/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-kube-api-access-qwghc\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.238976 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.238988 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.239000 4837 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 
13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.250363 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-config-data" (OuterVolumeSpecName: "config-data") pod "de6b1e01-3054-46d9-b2f3-a8f3a7e504af" (UID: "de6b1e01-3054-46d9-b2f3-a8f3a7e504af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.322602 4837 generic.go:334] "Generic (PLEG): container finished" podID="de6b1e01-3054-46d9-b2f3-a8f3a7e504af" containerID="a728200e4d66707c01f4e20cc7de5a1c1266b885af01c5ea8dd37d28e1bdd6bc" exitCode=0 Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.322677 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"de6b1e01-3054-46d9-b2f3-a8f3a7e504af","Type":"ContainerDied","Data":"a728200e4d66707c01f4e20cc7de5a1c1266b885af01c5ea8dd37d28e1bdd6bc"} Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.322718 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"de6b1e01-3054-46d9-b2f3-a8f3a7e504af","Type":"ContainerDied","Data":"036fe40da00c951a03639436f1d70f870b431fe6b0431a148132bcc8ff154aeb"} Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.322754 4837 scope.go:117] "RemoveContainer" containerID="997441b3a97d2775eceaed89ffadcc848e11b76800ad4277b31bca177c72edfb" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.323429 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.340427 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.342058 4837 scope.go:117] "RemoveContainer" containerID="a728200e4d66707c01f4e20cc7de5a1c1266b885af01c5ea8dd37d28e1bdd6bc" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.368600 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.368840 4837 scope.go:117] "RemoveContainer" containerID="997441b3a97d2775eceaed89ffadcc848e11b76800ad4277b31bca177c72edfb" Mar 13 12:07:33 crc kubenswrapper[4837]: E0313 12:07:33.369489 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"997441b3a97d2775eceaed89ffadcc848e11b76800ad4277b31bca177c72edfb\": container with ID starting with 997441b3a97d2775eceaed89ffadcc848e11b76800ad4277b31bca177c72edfb not found: ID does not exist" containerID="997441b3a97d2775eceaed89ffadcc848e11b76800ad4277b31bca177c72edfb" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.369600 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"997441b3a97d2775eceaed89ffadcc848e11b76800ad4277b31bca177c72edfb"} err="failed to get container status \"997441b3a97d2775eceaed89ffadcc848e11b76800ad4277b31bca177c72edfb\": rpc error: code = NotFound desc = could not find container \"997441b3a97d2775eceaed89ffadcc848e11b76800ad4277b31bca177c72edfb\": container with ID starting with 997441b3a97d2775eceaed89ffadcc848e11b76800ad4277b31bca177c72edfb not found: ID does not exist" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.369751 4837 scope.go:117] "RemoveContainer" 
containerID="a728200e4d66707c01f4e20cc7de5a1c1266b885af01c5ea8dd37d28e1bdd6bc" Mar 13 12:07:33 crc kubenswrapper[4837]: E0313 12:07:33.372101 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a728200e4d66707c01f4e20cc7de5a1c1266b885af01c5ea8dd37d28e1bdd6bc\": container with ID starting with a728200e4d66707c01f4e20cc7de5a1c1266b885af01c5ea8dd37d28e1bdd6bc not found: ID does not exist" containerID="a728200e4d66707c01f4e20cc7de5a1c1266b885af01c5ea8dd37d28e1bdd6bc" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.372143 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a728200e4d66707c01f4e20cc7de5a1c1266b885af01c5ea8dd37d28e1bdd6bc"} err="failed to get container status \"a728200e4d66707c01f4e20cc7de5a1c1266b885af01c5ea8dd37d28e1bdd6bc\": rpc error: code = NotFound desc = could not find container \"a728200e4d66707c01f4e20cc7de5a1c1266b885af01c5ea8dd37d28e1bdd6bc\": container with ID starting with a728200e4d66707c01f4e20cc7de5a1c1266b885af01c5ea8dd37d28e1bdd6bc not found: ID does not exist" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.379431 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.389193 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:07:33 crc kubenswrapper[4837]: E0313 12:07:33.389606 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de6b1e01-3054-46d9-b2f3-a8f3a7e504af" containerName="probe" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.389667 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6b1e01-3054-46d9-b2f3-a8f3a7e504af" containerName="probe" Mar 13 12:07:33 crc kubenswrapper[4837]: E0313 12:07:33.389688 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0faefca0-6038-4bdf-856e-b7cb5b6c5536" 
containerName="neutron-api" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.389697 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0faefca0-6038-4bdf-856e-b7cb5b6c5536" containerName="neutron-api" Mar 13 12:07:33 crc kubenswrapper[4837]: E0313 12:07:33.389711 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de6b1e01-3054-46d9-b2f3-a8f3a7e504af" containerName="cinder-scheduler" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.389718 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6b1e01-3054-46d9-b2f3-a8f3a7e504af" containerName="cinder-scheduler" Mar 13 12:07:33 crc kubenswrapper[4837]: E0313 12:07:33.389742 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e06de12b-6071-4dce-81f1-68539347ca19" containerName="dnsmasq-dns" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.389749 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e06de12b-6071-4dce-81f1-68539347ca19" containerName="dnsmasq-dns" Mar 13 12:07:33 crc kubenswrapper[4837]: E0313 12:07:33.389764 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0faefca0-6038-4bdf-856e-b7cb5b6c5536" containerName="neutron-httpd" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.389771 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0faefca0-6038-4bdf-856e-b7cb5b6c5536" containerName="neutron-httpd" Mar 13 12:07:33 crc kubenswrapper[4837]: E0313 12:07:33.389788 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" containerName="horizon" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.389795 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" containerName="horizon" Mar 13 12:07:33 crc kubenswrapper[4837]: E0313 12:07:33.389814 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" containerName="horizon-log" Mar 13 12:07:33 crc 
kubenswrapper[4837]: I0313 12:07:33.389819 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" containerName="horizon-log" Mar 13 12:07:33 crc kubenswrapper[4837]: E0313 12:07:33.389831 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f2afb5c-bfb2-4349-8000-4c0c90892d56" containerName="horizon" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.389837 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f2afb5c-bfb2-4349-8000-4c0c90892d56" containerName="horizon" Mar 13 12:07:33 crc kubenswrapper[4837]: E0313 12:07:33.389848 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e06de12b-6071-4dce-81f1-68539347ca19" containerName="init" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.389854 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e06de12b-6071-4dce-81f1-68539347ca19" containerName="init" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.390014 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="de6b1e01-3054-46d9-b2f3-a8f3a7e504af" containerName="cinder-scheduler" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.390024 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e06de12b-6071-4dce-81f1-68539347ca19" containerName="dnsmasq-dns" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.390033 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f2afb5c-bfb2-4349-8000-4c0c90892d56" containerName="horizon" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.390048 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" containerName="horizon" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.390055 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="de6b1e01-3054-46d9-b2f3-a8f3a7e504af" containerName="probe" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.390066 4837 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" containerName="horizon-log" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.390077 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0faefca0-6038-4bdf-856e-b7cb5b6c5536" containerName="neutron-httpd" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.390087 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0faefca0-6038-4bdf-856e-b7cb5b6c5536" containerName="neutron-api" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.391048 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.396462 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.411864 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.441952 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/580b8861-16eb-4142-bd61-6d0221a07f4d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.442119 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/580b8861-16eb-4142-bd61-6d0221a07f4d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.442184 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-f5q2b\" (UniqueName: \"kubernetes.io/projected/580b8861-16eb-4142-bd61-6d0221a07f4d-kube-api-access-f5q2b\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.442393 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/580b8861-16eb-4142-bd61-6d0221a07f4d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.442592 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/580b8861-16eb-4142-bd61-6d0221a07f4d-config-data\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.442675 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/580b8861-16eb-4142-bd61-6d0221a07f4d-scripts\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.544057 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/580b8861-16eb-4142-bd61-6d0221a07f4d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.544226 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/580b8861-16eb-4142-bd61-6d0221a07f4d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.544291 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5q2b\" (UniqueName: \"kubernetes.io/projected/580b8861-16eb-4142-bd61-6d0221a07f4d-kube-api-access-f5q2b\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.544325 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/580b8861-16eb-4142-bd61-6d0221a07f4d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.544365 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/580b8861-16eb-4142-bd61-6d0221a07f4d-config-data\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.544388 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/580b8861-16eb-4142-bd61-6d0221a07f4d-scripts\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.544489 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/580b8861-16eb-4142-bd61-6d0221a07f4d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " 
pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.549751 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/580b8861-16eb-4142-bd61-6d0221a07f4d-scripts\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.550393 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/580b8861-16eb-4142-bd61-6d0221a07f4d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.551442 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/580b8861-16eb-4142-bd61-6d0221a07f4d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.553480 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/580b8861-16eb-4142-bd61-6d0221a07f4d-config-data\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.579830 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5q2b\" (UniqueName: \"kubernetes.io/projected/580b8861-16eb-4142-bd61-6d0221a07f4d-kube-api-access-f5q2b\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.725282 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 12:07:34 crc kubenswrapper[4837]: I0313 12:07:34.180067 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:07:34 crc kubenswrapper[4837]: I0313 12:07:34.341385 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"580b8861-16eb-4142-bd61-6d0221a07f4d","Type":"ContainerStarted","Data":"5f16e4151a5c5a1d5e3bbbbef9382f3b0bfdb6f80b756732f6067a78ca15814c"} Mar 13 12:07:35 crc kubenswrapper[4837]: I0313 12:07:35.061594 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de6b1e01-3054-46d9-b2f3-a8f3a7e504af" path="/var/lib/kubelet/pods/de6b1e01-3054-46d9-b2f3-a8f3a7e504af/volumes" Mar 13 12:07:35 crc kubenswrapper[4837]: I0313 12:07:35.366782 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"580b8861-16eb-4142-bd61-6d0221a07f4d","Type":"ContainerStarted","Data":"70a8b3af953d1a1c3f624af85221c7d64ee1bd28cc05308446e5cef8cdae4234"} Mar 13 12:07:35 crc kubenswrapper[4837]: I0313 12:07:35.483579 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:07:35 crc kubenswrapper[4837]: I0313 12:07:35.483676 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:07:36 crc kubenswrapper[4837]: I0313 12:07:36.155536 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5596f9dfb8-m9bxb" 
podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 13 12:07:36 crc kubenswrapper[4837]: I0313 12:07:36.377997 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"580b8861-16eb-4142-bd61-6d0221a07f4d","Type":"ContainerStarted","Data":"10e53747353169370f11883e6aece96dc7e6854b97f5b0d0c3342f5f1fc98d51"} Mar 13 12:07:36 crc kubenswrapper[4837]: I0313 12:07:36.382518 4837 generic.go:334] "Generic (PLEG): container finished" podID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerID="7e464f7436823332f050e26237bc563d04c928c21ee9b8d3087ae1cc9a85aacb" exitCode=0 Mar 13 12:07:36 crc kubenswrapper[4837]: I0313 12:07:36.382582 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5596f9dfb8-m9bxb" event={"ID":"2a28d7a5-22a2-460a-a08c-8eb484e6c382","Type":"ContainerDied","Data":"7e464f7436823332f050e26237bc563d04c928c21ee9b8d3087ae1cc9a85aacb"} Mar 13 12:07:36 crc kubenswrapper[4837]: I0313 12:07:36.417298 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.417270206 podStartE2EDuration="3.417270206s" podCreationTimestamp="2026-03-13 12:07:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:36.415603554 +0000 UTC m=+1172.053870317" watchObservedRunningTime="2026-03-13 12:07:36.417270206 +0000 UTC m=+1172.055536969" Mar 13 12:07:36 crc kubenswrapper[4837]: I0313 12:07:36.779113 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:36 crc kubenswrapper[4837]: I0313 12:07:36.929203 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:37 crc kubenswrapper[4837]: I0313 12:07:37.025752 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7598d89cd4-qfmh9"] Mar 13 12:07:37 crc kubenswrapper[4837]: I0313 12:07:37.026426 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7598d89cd4-qfmh9" podUID="91206ea2-5d2b-478d-983e-6c842f02819b" containerName="barbican-api-log" containerID="cri-o://104e38d91432a24be429666c7aef47a48dc5e37624f7f42d829e3d5a83308ad5" gracePeriod=30 Mar 13 12:07:37 crc kubenswrapper[4837]: I0313 12:07:37.026576 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7598d89cd4-qfmh9" podUID="91206ea2-5d2b-478d-983e-6c842f02819b" containerName="barbican-api" containerID="cri-o://f929d8442913bfaecc7956e8f7c394bf2287e01e6f101666b06a41edc759a582" gracePeriod=30 Mar 13 12:07:37 crc kubenswrapper[4837]: I0313 12:07:37.395215 4837 generic.go:334] "Generic (PLEG): container finished" podID="91206ea2-5d2b-478d-983e-6c842f02819b" containerID="104e38d91432a24be429666c7aef47a48dc5e37624f7f42d829e3d5a83308ad5" exitCode=143 Mar 13 12:07:37 crc kubenswrapper[4837]: I0313 12:07:37.396217 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7598d89cd4-qfmh9" event={"ID":"91206ea2-5d2b-478d-983e-6c842f02819b","Type":"ContainerDied","Data":"104e38d91432a24be429666c7aef47a48dc5e37624f7f42d829e3d5a83308ad5"} Mar 13 12:07:38 crc kubenswrapper[4837]: I0313 12:07:38.727616 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 13 12:07:39 crc kubenswrapper[4837]: I0313 12:07:39.113424 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.196817 4837 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/barbican-api-7598d89cd4-qfmh9" podUID="91206ea2-5d2b-478d-983e-6c842f02819b" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:46590->10.217.0.166:9311: read: connection reset by peer" Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.196843 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7598d89cd4-qfmh9" podUID="91206ea2-5d2b-478d-983e-6c842f02819b" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:46598->10.217.0.166:9311: read: connection reset by peer" Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.421333 4837 generic.go:334] "Generic (PLEG): container finished" podID="91206ea2-5d2b-478d-983e-6c842f02819b" containerID="f929d8442913bfaecc7956e8f7c394bf2287e01e6f101666b06a41edc759a582" exitCode=0 Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.421375 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7598d89cd4-qfmh9" event={"ID":"91206ea2-5d2b-478d-983e-6c842f02819b","Type":"ContainerDied","Data":"f929d8442913bfaecc7956e8f7c394bf2287e01e6f101666b06a41edc759a582"} Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.623702 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.722339 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx5rf\" (UniqueName: \"kubernetes.io/projected/91206ea2-5d2b-478d-983e-6c842f02819b-kube-api-access-bx5rf\") pod \"91206ea2-5d2b-478d-983e-6c842f02819b\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.722386 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-config-data-custom\") pod \"91206ea2-5d2b-478d-983e-6c842f02819b\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.722444 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-config-data\") pod \"91206ea2-5d2b-478d-983e-6c842f02819b\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.722509 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-combined-ca-bundle\") pod \"91206ea2-5d2b-478d-983e-6c842f02819b\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.722686 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91206ea2-5d2b-478d-983e-6c842f02819b-logs\") pod \"91206ea2-5d2b-478d-983e-6c842f02819b\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.723737 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/91206ea2-5d2b-478d-983e-6c842f02819b-logs" (OuterVolumeSpecName: "logs") pod "91206ea2-5d2b-478d-983e-6c842f02819b" (UID: "91206ea2-5d2b-478d-983e-6c842f02819b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.735097 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "91206ea2-5d2b-478d-983e-6c842f02819b" (UID: "91206ea2-5d2b-478d-983e-6c842f02819b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.746017 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91206ea2-5d2b-478d-983e-6c842f02819b-kube-api-access-bx5rf" (OuterVolumeSpecName: "kube-api-access-bx5rf") pod "91206ea2-5d2b-478d-983e-6c842f02819b" (UID: "91206ea2-5d2b-478d-983e-6c842f02819b"). InnerVolumeSpecName "kube-api-access-bx5rf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.826996 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91206ea2-5d2b-478d-983e-6c842f02819b-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.827048 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx5rf\" (UniqueName: \"kubernetes.io/projected/91206ea2-5d2b-478d-983e-6c842f02819b-kube-api-access-bx5rf\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.827063 4837 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.851851 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91206ea2-5d2b-478d-983e-6c842f02819b" (UID: "91206ea2-5d2b-478d-983e-6c842f02819b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.898868 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-config-data" (OuterVolumeSpecName: "config-data") pod "91206ea2-5d2b-478d-983e-6c842f02819b" (UID: "91206ea2-5d2b-478d-983e-6c842f02819b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.929372 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.929427 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:41 crc kubenswrapper[4837]: I0313 12:07:41.430600 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7598d89cd4-qfmh9" event={"ID":"91206ea2-5d2b-478d-983e-6c842f02819b","Type":"ContainerDied","Data":"e401c09fc39f0377fdb0e13cc3564c85b21b640ae75df7edda1290f89d0c1fda"} Mar 13 12:07:41 crc kubenswrapper[4837]: I0313 12:07:41.430668 4837 scope.go:117] "RemoveContainer" containerID="f929d8442913bfaecc7956e8f7c394bf2287e01e6f101666b06a41edc759a582" Mar 13 12:07:41 crc kubenswrapper[4837]: I0313 12:07:41.430673 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:41 crc kubenswrapper[4837]: I0313 12:07:41.451247 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7598d89cd4-qfmh9"] Mar 13 12:07:41 crc kubenswrapper[4837]: I0313 12:07:41.460284 4837 scope.go:117] "RemoveContainer" containerID="104e38d91432a24be429666c7aef47a48dc5e37624f7f42d829e3d5a83308ad5" Mar 13 12:07:41 crc kubenswrapper[4837]: I0313 12:07:41.461143 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7598d89cd4-qfmh9"] Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.813718 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 13 12:07:42 crc kubenswrapper[4837]: E0313 12:07:42.814321 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91206ea2-5d2b-478d-983e-6c842f02819b" containerName="barbican-api-log" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.814341 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="91206ea2-5d2b-478d-983e-6c842f02819b" containerName="barbican-api-log" Mar 13 12:07:42 crc kubenswrapper[4837]: E0313 12:07:42.814389 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91206ea2-5d2b-478d-983e-6c842f02819b" containerName="barbican-api" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.814399 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="91206ea2-5d2b-478d-983e-6c842f02819b" containerName="barbican-api" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.814718 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="91206ea2-5d2b-478d-983e-6c842f02819b" containerName="barbican-api" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.814750 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="91206ea2-5d2b-478d-983e-6c842f02819b" containerName="barbican-api-log" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.815776 4837 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.818198 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.818535 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.818713 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-x4nfx" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.826497 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.877820 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c53a4cf-579b-49dd-88e3-cee64443611e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " pod="openstack/openstackclient" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.877887 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgnqf\" (UniqueName: \"kubernetes.io/projected/3c53a4cf-579b-49dd-88e3-cee64443611e-kube-api-access-cgnqf\") pod \"openstackclient\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " pod="openstack/openstackclient" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.877975 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c53a4cf-579b-49dd-88e3-cee64443611e-openstack-config-secret\") pod \"openstackclient\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " pod="openstack/openstackclient" Mar 13 12:07:42 crc kubenswrapper[4837]: 
I0313 12:07:42.878014 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c53a4cf-579b-49dd-88e3-cee64443611e-openstack-config\") pod \"openstackclient\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " pod="openstack/openstackclient" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.979824 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c53a4cf-579b-49dd-88e3-cee64443611e-openstack-config\") pod \"openstackclient\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " pod="openstack/openstackclient" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.980138 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c53a4cf-579b-49dd-88e3-cee64443611e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " pod="openstack/openstackclient" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.980250 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgnqf\" (UniqueName: \"kubernetes.io/projected/3c53a4cf-579b-49dd-88e3-cee64443611e-kube-api-access-cgnqf\") pod \"openstackclient\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " pod="openstack/openstackclient" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.980377 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c53a4cf-579b-49dd-88e3-cee64443611e-openstack-config-secret\") pod \"openstackclient\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " pod="openstack/openstackclient" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.980993 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" 
(UniqueName: \"kubernetes.io/configmap/3c53a4cf-579b-49dd-88e3-cee64443611e-openstack-config\") pod \"openstackclient\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.000507 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c53a4cf-579b-49dd-88e3-cee64443611e-openstack-config-secret\") pod \"openstackclient\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.001824 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c53a4cf-579b-49dd-88e3-cee64443611e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.003410 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgnqf\" (UniqueName: \"kubernetes.io/projected/3c53a4cf-579b-49dd-88e3-cee64443611e-kube-api-access-cgnqf\") pod \"openstackclient\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.059241 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91206ea2-5d2b-478d-983e-6c842f02819b" path="/var/lib/kubelet/pods/91206ea2-5d2b-478d-983e-6c842f02819b/volumes" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.150279 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.267368 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.279700 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.298148 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.299359 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.308786 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.389747 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrnfz\" (UniqueName: \"kubernetes.io/projected/5d15c820-a2ee-4d4c-986f-2c2f09b43f79-kube-api-access-mrnfz\") pod \"openstackclient\" (UID: \"5d15c820-a2ee-4d4c-986f-2c2f09b43f79\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.390144 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d15c820-a2ee-4d4c-986f-2c2f09b43f79-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5d15c820-a2ee-4d4c-986f-2c2f09b43f79\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.390237 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5d15c820-a2ee-4d4c-986f-2c2f09b43f79-openstack-config-secret\") pod \"openstackclient\" (UID: \"5d15c820-a2ee-4d4c-986f-2c2f09b43f79\") " 
pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.390294 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5d15c820-a2ee-4d4c-986f-2c2f09b43f79-openstack-config\") pod \"openstackclient\" (UID: \"5d15c820-a2ee-4d4c-986f-2c2f09b43f79\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.492823 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5d15c820-a2ee-4d4c-986f-2c2f09b43f79-openstack-config\") pod \"openstackclient\" (UID: \"5d15c820-a2ee-4d4c-986f-2c2f09b43f79\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.492978 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrnfz\" (UniqueName: \"kubernetes.io/projected/5d15c820-a2ee-4d4c-986f-2c2f09b43f79-kube-api-access-mrnfz\") pod \"openstackclient\" (UID: \"5d15c820-a2ee-4d4c-986f-2c2f09b43f79\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.493015 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d15c820-a2ee-4d4c-986f-2c2f09b43f79-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5d15c820-a2ee-4d4c-986f-2c2f09b43f79\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.493074 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5d15c820-a2ee-4d4c-986f-2c2f09b43f79-openstack-config-secret\") pod \"openstackclient\" (UID: \"5d15c820-a2ee-4d4c-986f-2c2f09b43f79\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.493820 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5d15c820-a2ee-4d4c-986f-2c2f09b43f79-openstack-config\") pod \"openstackclient\" (UID: \"5d15c820-a2ee-4d4c-986f-2c2f09b43f79\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.500462 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5d15c820-a2ee-4d4c-986f-2c2f09b43f79-openstack-config-secret\") pod \"openstackclient\" (UID: \"5d15c820-a2ee-4d4c-986f-2c2f09b43f79\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.500549 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d15c820-a2ee-4d4c-986f-2c2f09b43f79-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5d15c820-a2ee-4d4c-986f-2c2f09b43f79\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.514810 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrnfz\" (UniqueName: \"kubernetes.io/projected/5d15c820-a2ee-4d4c-986f-2c2f09b43f79-kube-api-access-mrnfz\") pod \"openstackclient\" (UID: \"5d15c820-a2ee-4d4c-986f-2c2f09b43f79\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.630097 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: E0313 12:07:43.677799 4837 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 13 12:07:43 crc kubenswrapper[4837]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_3c53a4cf-579b-49dd-88e3-cee64443611e_0(e06daad3c7287324af1d17357c43cce41d497603c6e8a06dedc6a1df4754a4f2): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"e06daad3c7287324af1d17357c43cce41d497603c6e8a06dedc6a1df4754a4f2" Netns:"/var/run/netns/2bd1c5d9-89fa-4eca-a7d7-beffdd102d90" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=e06daad3c7287324af1d17357c43cce41d497603c6e8a06dedc6a1df4754a4f2;K8S_POD_UID=3c53a4cf-579b-49dd-88e3-cee64443611e" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: [openstack/openstackclient/3c53a4cf-579b-49dd-88e3-cee64443611e:ovn-kubernetes]: error adding container to network "ovn-kubernetes": CNI request failed with status 400: '[openstack/openstackclient e06daad3c7287324af1d17357c43cce41d497603c6e8a06dedc6a1df4754a4f2 network default NAD default] [openstack/openstackclient e06daad3c7287324af1d17357c43cce41d497603c6e8a06dedc6a1df4754a4f2 network default NAD default] failed to configure pod interface: canceled old pod sandbox waiting for OVS port binding for 0a:58:0a:d9:00:ae [10.217.0.174/23] Mar 13 12:07:43 crc kubenswrapper[4837]: ' Mar 13 12:07:43 crc kubenswrapper[4837]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 12:07:43 crc kubenswrapper[4837]: > Mar 13 12:07:43 crc kubenswrapper[4837]: E0313 12:07:43.677931 4837 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 13 12:07:43 crc kubenswrapper[4837]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_3c53a4cf-579b-49dd-88e3-cee64443611e_0(e06daad3c7287324af1d17357c43cce41d497603c6e8a06dedc6a1df4754a4f2): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"e06daad3c7287324af1d17357c43cce41d497603c6e8a06dedc6a1df4754a4f2" Netns:"/var/run/netns/2bd1c5d9-89fa-4eca-a7d7-beffdd102d90" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=e06daad3c7287324af1d17357c43cce41d497603c6e8a06dedc6a1df4754a4f2;K8S_POD_UID=3c53a4cf-579b-49dd-88e3-cee64443611e" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: [openstack/openstackclient/3c53a4cf-579b-49dd-88e3-cee64443611e:ovn-kubernetes]: error adding container to network "ovn-kubernetes": CNI request failed with status 400: '[openstack/openstackclient e06daad3c7287324af1d17357c43cce41d497603c6e8a06dedc6a1df4754a4f2 network default NAD default] [openstack/openstackclient e06daad3c7287324af1d17357c43cce41d497603c6e8a06dedc6a1df4754a4f2 network default NAD default] failed to configure pod interface: canceled old pod sandbox waiting for OVS port binding for 0a:58:0a:d9:00:ae [10.217.0.174/23] Mar 13 
12:07:43 crc kubenswrapper[4837]: ' Mar 13 12:07:43 crc kubenswrapper[4837]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 12:07:43 crc kubenswrapper[4837]: > pod="openstack/openstackclient" Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.035448 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.105746 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.474556 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5d15c820-a2ee-4d4c-986f-2c2f09b43f79","Type":"ContainerStarted","Data":"f21ec75b73478c7d1fa58479d3e978e46604c8a48f0a1340b54b15d8f3a3caaf"} Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.474585 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.479706 4837 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="3c53a4cf-579b-49dd-88e3-cee64443611e" podUID="5d15c820-a2ee-4d4c-986f-2c2f09b43f79" Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.487578 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.511748 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgnqf\" (UniqueName: \"kubernetes.io/projected/3c53a4cf-579b-49dd-88e3-cee64443611e-kube-api-access-cgnqf\") pod \"3c53a4cf-579b-49dd-88e3-cee64443611e\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.511807 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c53a4cf-579b-49dd-88e3-cee64443611e-combined-ca-bundle\") pod \"3c53a4cf-579b-49dd-88e3-cee64443611e\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.512019 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c53a4cf-579b-49dd-88e3-cee64443611e-openstack-config-secret\") pod \"3c53a4cf-579b-49dd-88e3-cee64443611e\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.512092 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c53a4cf-579b-49dd-88e3-cee64443611e-openstack-config\") pod \"3c53a4cf-579b-49dd-88e3-cee64443611e\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.512734 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c53a4cf-579b-49dd-88e3-cee64443611e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "3c53a4cf-579b-49dd-88e3-cee64443611e" (UID: "3c53a4cf-579b-49dd-88e3-cee64443611e"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.517950 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c53a4cf-579b-49dd-88e3-cee64443611e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "3c53a4cf-579b-49dd-88e3-cee64443611e" (UID: "3c53a4cf-579b-49dd-88e3-cee64443611e"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.518113 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c53a4cf-579b-49dd-88e3-cee64443611e-kube-api-access-cgnqf" (OuterVolumeSpecName: "kube-api-access-cgnqf") pod "3c53a4cf-579b-49dd-88e3-cee64443611e" (UID: "3c53a4cf-579b-49dd-88e3-cee64443611e"). InnerVolumeSpecName "kube-api-access-cgnqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.518174 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c53a4cf-579b-49dd-88e3-cee64443611e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c53a4cf-579b-49dd-88e3-cee64443611e" (UID: "3c53a4cf-579b-49dd-88e3-cee64443611e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.614752 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgnqf\" (UniqueName: \"kubernetes.io/projected/3c53a4cf-579b-49dd-88e3-cee64443611e-kube-api-access-cgnqf\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.615105 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c53a4cf-579b-49dd-88e3-cee64443611e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.615118 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c53a4cf-579b-49dd-88e3-cee64443611e-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.615133 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c53a4cf-579b-49dd-88e3-cee64443611e-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:45 crc kubenswrapper[4837]: I0313 12:07:45.063436 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c53a4cf-579b-49dd-88e3-cee64443611e" path="/var/lib/kubelet/pods/3c53a4cf-579b-49dd-88e3-cee64443611e/volumes" Mar 13 12:07:45 crc kubenswrapper[4837]: I0313 12:07:45.484437 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 12:07:45 crc kubenswrapper[4837]: I0313 12:07:45.490983 4837 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="3c53a4cf-579b-49dd-88e3-cee64443611e" podUID="5d15c820-a2ee-4d4c-986f-2c2f09b43f79" Mar 13 12:07:46 crc kubenswrapper[4837]: I0313 12:07:46.155941 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5596f9dfb8-m9bxb" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.467118 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-bfbc874dc-vsh7q"] Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.469302 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.471749 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.471787 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.481320 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-bfbc874dc-vsh7q"] Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.484088 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.581195 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/36ffa543-526d-4d56-b599-06fcfe0988cf-log-httpd\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.581294 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7474p\" (UniqueName: \"kubernetes.io/projected/36ffa543-526d-4d56-b599-06fcfe0988cf-kube-api-access-7474p\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.581330 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/36ffa543-526d-4d56-b599-06fcfe0988cf-etc-swift\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 
12:07:47.581357 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36ffa543-526d-4d56-b599-06fcfe0988cf-public-tls-certs\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.581401 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/36ffa543-526d-4d56-b599-06fcfe0988cf-run-httpd\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.581452 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36ffa543-526d-4d56-b599-06fcfe0988cf-internal-tls-certs\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.581486 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ffa543-526d-4d56-b599-06fcfe0988cf-config-data\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.581520 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ffa543-526d-4d56-b599-06fcfe0988cf-combined-ca-bundle\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc 
kubenswrapper[4837]: I0313 12:07:47.682960 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36ffa543-526d-4d56-b599-06fcfe0988cf-internal-tls-certs\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.683023 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ffa543-526d-4d56-b599-06fcfe0988cf-config-data\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.683064 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ffa543-526d-4d56-b599-06fcfe0988cf-combined-ca-bundle\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.683138 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/36ffa543-526d-4d56-b599-06fcfe0988cf-log-httpd\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.683168 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7474p\" (UniqueName: \"kubernetes.io/projected/36ffa543-526d-4d56-b599-06fcfe0988cf-kube-api-access-7474p\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.683191 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/36ffa543-526d-4d56-b599-06fcfe0988cf-etc-swift\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.683212 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36ffa543-526d-4d56-b599-06fcfe0988cf-public-tls-certs\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.683243 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/36ffa543-526d-4d56-b599-06fcfe0988cf-run-httpd\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.683873 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/36ffa543-526d-4d56-b599-06fcfe0988cf-run-httpd\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.684392 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/36ffa543-526d-4d56-b599-06fcfe0988cf-log-httpd\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.692662 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/36ffa543-526d-4d56-b599-06fcfe0988cf-public-tls-certs\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.692673 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36ffa543-526d-4d56-b599-06fcfe0988cf-internal-tls-certs\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.692983 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ffa543-526d-4d56-b599-06fcfe0988cf-config-data\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.693516 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/36ffa543-526d-4d56-b599-06fcfe0988cf-etc-swift\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.696489 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ffa543-526d-4d56-b599-06fcfe0988cf-combined-ca-bundle\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.706404 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7474p\" (UniqueName: \"kubernetes.io/projected/36ffa543-526d-4d56-b599-06fcfe0988cf-kube-api-access-7474p\") pod 
\"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.789616 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:48 crc kubenswrapper[4837]: I0313 12:07:48.354948 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-bfbc874dc-vsh7q"] Mar 13 12:07:49 crc kubenswrapper[4837]: I0313 12:07:49.043516 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:49 crc kubenswrapper[4837]: I0313 12:07:49.045145 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:49 crc kubenswrapper[4837]: I0313 12:07:49.140931 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:07:49 crc kubenswrapper[4837]: I0313 12:07:49.141224 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="ceilometer-central-agent" containerID="cri-o://f106579d1eb92efbafe377b2c5e41ffb980fcd44573e4b8ba73109499680b552" gracePeriod=30 Mar 13 12:07:49 crc kubenswrapper[4837]: I0313 12:07:49.141282 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="proxy-httpd" containerID="cri-o://317a9b585064c8564bcd2ae43ba40834f6b1cd25a3d83d32f956502b5b280276" gracePeriod=30 Mar 13 12:07:49 crc kubenswrapper[4837]: I0313 12:07:49.141345 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="sg-core" 
containerID="cri-o://7d867af8873a7e0421fd52164c9458573cf6ac8847b38f845cf622f104ceb41b" gracePeriod=30 Mar 13 12:07:49 crc kubenswrapper[4837]: I0313 12:07:49.141411 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="ceilometer-notification-agent" containerID="cri-o://343420b862af8b30fbf01c83c65d52d9d2faba010cdc819f2b823e8a9b058006" gracePeriod=30 Mar 13 12:07:49 crc kubenswrapper[4837]: I0313 12:07:49.172874 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 13 12:07:49 crc kubenswrapper[4837]: I0313 12:07:49.544044 4837 generic.go:334] "Generic (PLEG): container finished" podID="8944c2be-da67-4cdd-9f75-0e473253e932" containerID="317a9b585064c8564bcd2ae43ba40834f6b1cd25a3d83d32f956502b5b280276" exitCode=0 Mar 13 12:07:49 crc kubenswrapper[4837]: I0313 12:07:49.544085 4837 generic.go:334] "Generic (PLEG): container finished" podID="8944c2be-da67-4cdd-9f75-0e473253e932" containerID="7d867af8873a7e0421fd52164c9458573cf6ac8847b38f845cf622f104ceb41b" exitCode=2 Mar 13 12:07:49 crc kubenswrapper[4837]: I0313 12:07:49.545013 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8944c2be-da67-4cdd-9f75-0e473253e932","Type":"ContainerDied","Data":"317a9b585064c8564bcd2ae43ba40834f6b1cd25a3d83d32f956502b5b280276"} Mar 13 12:07:49 crc kubenswrapper[4837]: I0313 12:07:49.545052 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8944c2be-da67-4cdd-9f75-0e473253e932","Type":"ContainerDied","Data":"7d867af8873a7e0421fd52164c9458573cf6ac8847b38f845cf622f104ceb41b"} Mar 13 12:07:50 crc kubenswrapper[4837]: I0313 12:07:50.563389 4837 generic.go:334] "Generic (PLEG): container finished" 
podID="8944c2be-da67-4cdd-9f75-0e473253e932" containerID="343420b862af8b30fbf01c83c65d52d9d2faba010cdc819f2b823e8a9b058006" exitCode=0 Mar 13 12:07:50 crc kubenswrapper[4837]: I0313 12:07:50.563742 4837 generic.go:334] "Generic (PLEG): container finished" podID="8944c2be-da67-4cdd-9f75-0e473253e932" containerID="f106579d1eb92efbafe377b2c5e41ffb980fcd44573e4b8ba73109499680b552" exitCode=0 Mar 13 12:07:50 crc kubenswrapper[4837]: I0313 12:07:50.563455 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8944c2be-da67-4cdd-9f75-0e473253e932","Type":"ContainerDied","Data":"343420b862af8b30fbf01c83c65d52d9d2faba010cdc819f2b823e8a9b058006"} Mar 13 12:07:50 crc kubenswrapper[4837]: I0313 12:07:50.563787 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8944c2be-da67-4cdd-9f75-0e473253e932","Type":"ContainerDied","Data":"f106579d1eb92efbafe377b2c5e41ffb980fcd44573e4b8ba73109499680b552"} Mar 13 12:07:52 crc kubenswrapper[4837]: I0313 12:07:52.339338 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.170:3000/\": dial tcp 10.217.0.170:3000: connect: connection refused" Mar 13 12:07:53 crc kubenswrapper[4837]: W0313 12:07:53.939750 4837 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8944c2be_da67_4cdd_9f75_0e473253e932.slice/crio-343420b862af8b30fbf01c83c65d52d9d2faba010cdc819f2b823e8a9b058006.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8944c2be_da67_4cdd_9f75_0e473253e932.slice/crio-343420b862af8b30fbf01c83c65d52d9d2faba010cdc819f2b823e8a9b058006.scope: no such file or directory Mar 13 12:07:53 crc kubenswrapper[4837]: W0313 12:07:53.944871 4837 
watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8944c2be_da67_4cdd_9f75_0e473253e932.slice/crio-conmon-7d867af8873a7e0421fd52164c9458573cf6ac8847b38f845cf622f104ceb41b.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8944c2be_da67_4cdd_9f75_0e473253e932.slice/crio-conmon-7d867af8873a7e0421fd52164c9458573cf6ac8847b38f845cf622f104ceb41b.scope: no such file or directory Mar 13 12:07:53 crc kubenswrapper[4837]: W0313 12:07:53.944932 4837 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8944c2be_da67_4cdd_9f75_0e473253e932.slice/crio-7d867af8873a7e0421fd52164c9458573cf6ac8847b38f845cf622f104ceb41b.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8944c2be_da67_4cdd_9f75_0e473253e932.slice/crio-7d867af8873a7e0421fd52164c9458573cf6ac8847b38f845cf622f104ceb41b.scope: no such file or directory Mar 13 12:07:53 crc kubenswrapper[4837]: W0313 12:07:53.945002 4837 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8944c2be_da67_4cdd_9f75_0e473253e932.slice/crio-conmon-317a9b585064c8564bcd2ae43ba40834f6b1cd25a3d83d32f956502b5b280276.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8944c2be_da67_4cdd_9f75_0e473253e932.slice/crio-conmon-317a9b585064c8564bcd2ae43ba40834f6b1cd25a3d83d32f956502b5b280276.scope: no such file or directory Mar 13 12:07:53 crc kubenswrapper[4837]: W0313 12:07:53.945022 4837 watcher.go:93] Error while processing event 
("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8944c2be_da67_4cdd_9f75_0e473253e932.slice/crio-317a9b585064c8564bcd2ae43ba40834f6b1cd25a3d83d32f956502b5b280276.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8944c2be_da67_4cdd_9f75_0e473253e932.slice/crio-317a9b585064c8564bcd2ae43ba40834f6b1cd25a3d83d32f956502b5b280276.scope: no such file or directory Mar 13 12:07:53 crc kubenswrapper[4837]: W0313 12:07:53.960962 4837 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c53a4cf_579b_49dd_88e3_cee64443611e.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c53a4cf_579b_49dd_88e3_cee64443611e.slice: no such file or directory Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.445297 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.514932 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.619744 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz8t7\" (UniqueName: \"kubernetes.io/projected/8944c2be-da67-4cdd-9f75-0e473253e932-kube-api-access-cz8t7\") pod \"8944c2be-da67-4cdd-9f75-0e473253e932\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.619820 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-combined-ca-bundle\") pod \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.619859 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8944c2be-da67-4cdd-9f75-0e473253e932-run-httpd\") pod \"8944c2be-da67-4cdd-9f75-0e473253e932\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.619887 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8944c2be-da67-4cdd-9f75-0e473253e932-log-httpd\") pod \"8944c2be-da67-4cdd-9f75-0e473253e932\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.619957 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-scripts\") pod \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.619983 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-sg-core-conf-yaml\") pod \"8944c2be-da67-4cdd-9f75-0e473253e932\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.620024 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f484085-7b83-46a8-80c2-b3ef6f8b8798-logs\") pod \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.620049 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-config-data\") pod \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.620080 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-config-data\") pod \"8944c2be-da67-4cdd-9f75-0e473253e932\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.620129 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f484085-7b83-46a8-80c2-b3ef6f8b8798-etc-machine-id\") pod \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.620155 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-combined-ca-bundle\") pod \"8944c2be-da67-4cdd-9f75-0e473253e932\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.620216 4837 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppxfm\" (UniqueName: \"kubernetes.io/projected/6f484085-7b83-46a8-80c2-b3ef6f8b8798-kube-api-access-ppxfm\") pod \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.620262 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-scripts\") pod \"8944c2be-da67-4cdd-9f75-0e473253e932\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.620283 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-config-data-custom\") pod \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.620831 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8944c2be-da67-4cdd-9f75-0e473253e932-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8944c2be-da67-4cdd-9f75-0e473253e932" (UID: "8944c2be-da67-4cdd-9f75-0e473253e932"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.627786 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8944c2be-da67-4cdd-9f75-0e473253e932-kube-api-access-cz8t7" (OuterVolumeSpecName: "kube-api-access-cz8t7") pod "8944c2be-da67-4cdd-9f75-0e473253e932" (UID: "8944c2be-da67-4cdd-9f75-0e473253e932"). InnerVolumeSpecName "kube-api-access-cz8t7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.628743 4837 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8944c2be-da67-4cdd-9f75-0e473253e932-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.629089 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6f484085-7b83-46a8-80c2-b3ef6f8b8798" (UID: "6f484085-7b83-46a8-80c2-b3ef6f8b8798"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.630578 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f484085-7b83-46a8-80c2-b3ef6f8b8798-logs" (OuterVolumeSpecName: "logs") pod "6f484085-7b83-46a8-80c2-b3ef6f8b8798" (UID: "6f484085-7b83-46a8-80c2-b3ef6f8b8798"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.630672 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f484085-7b83-46a8-80c2-b3ef6f8b8798-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6f484085-7b83-46a8-80c2-b3ef6f8b8798" (UID: "6f484085-7b83-46a8-80c2-b3ef6f8b8798"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.631094 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8944c2be-da67-4cdd-9f75-0e473253e932-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8944c2be-da67-4cdd-9f75-0e473253e932" (UID: "8944c2be-da67-4cdd-9f75-0e473253e932"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.631759 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-scripts" (OuterVolumeSpecName: "scripts") pod "6f484085-7b83-46a8-80c2-b3ef6f8b8798" (UID: "6f484085-7b83-46a8-80c2-b3ef6f8b8798"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.634115 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-bfbc874dc-vsh7q" event={"ID":"36ffa543-526d-4d56-b599-06fcfe0988cf","Type":"ContainerStarted","Data":"7f0dc730f5a20cd902650f8ce857c7f2a912d9c2636f815700cba37775ba2724"} Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.634165 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-bfbc874dc-vsh7q" event={"ID":"36ffa543-526d-4d56-b599-06fcfe0988cf","Type":"ContainerStarted","Data":"9361990b64ae4a8f16092212426b5d26a1198cd3e68259a624dc137146c20a4c"} Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.642506 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f484085-7b83-46a8-80c2-b3ef6f8b8798-kube-api-access-ppxfm" (OuterVolumeSpecName: "kube-api-access-ppxfm") pod "6f484085-7b83-46a8-80c2-b3ef6f8b8798" (UID: "6f484085-7b83-46a8-80c2-b3ef6f8b8798"). InnerVolumeSpecName "kube-api-access-ppxfm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.644369 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5d15c820-a2ee-4d4c-986f-2c2f09b43f79","Type":"ContainerStarted","Data":"b8ad12b2d30d012686daaa741a37ef92d03c205452f05b4aae0977eeca475799"} Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.646800 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-scripts" (OuterVolumeSpecName: "scripts") pod "8944c2be-da67-4cdd-9f75-0e473253e932" (UID: "8944c2be-da67-4cdd-9f75-0e473253e932"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.663056 4837 generic.go:334] "Generic (PLEG): container finished" podID="6f484085-7b83-46a8-80c2-b3ef6f8b8798" containerID="4acde6c31477a18525c2bc313aa155955862f73aaa7708329d4edc29e752be5d" exitCode=137 Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.663194 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6f484085-7b83-46a8-80c2-b3ef6f8b8798","Type":"ContainerDied","Data":"4acde6c31477a18525c2bc313aa155955862f73aaa7708329d4edc29e752be5d"} Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.663225 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6f484085-7b83-46a8-80c2-b3ef6f8b8798","Type":"ContainerDied","Data":"2608f88642291363e7163567a42948a0027f5da1a879663defaa6a0c943729b9"} Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.663243 4837 scope.go:117] "RemoveContainer" containerID="4acde6c31477a18525c2bc313aa155955862f73aaa7708329d4edc29e752be5d" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.663195 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.668673 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.477881006 podStartE2EDuration="11.668630542s" podCreationTimestamp="2026-03-13 12:07:43 +0000 UTC" firstStartedPulling="2026-03-13 12:07:44.110366657 +0000 UTC m=+1179.748633440" lastFinishedPulling="2026-03-13 12:07:54.301116213 +0000 UTC m=+1189.939382976" observedRunningTime="2026-03-13 12:07:54.663292914 +0000 UTC m=+1190.301559697" watchObservedRunningTime="2026-03-13 12:07:54.668630542 +0000 UTC m=+1190.306897325" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.672787 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f484085-7b83-46a8-80c2-b3ef6f8b8798" (UID: "6f484085-7b83-46a8-80c2-b3ef6f8b8798"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.678730 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8944c2be-da67-4cdd-9f75-0e473253e932","Type":"ContainerDied","Data":"affa40a245268506c6f6766fb2f158d46986fd7f106dd4cfb003b265c6f1faa4"} Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.679078 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.681692 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8944c2be-da67-4cdd-9f75-0e473253e932" (UID: "8944c2be-da67-4cdd-9f75-0e473253e932"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.700027 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-config-data" (OuterVolumeSpecName: "config-data") pod "6f484085-7b83-46a8-80c2-b3ef6f8b8798" (UID: "6f484085-7b83-46a8-80c2-b3ef6f8b8798"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.700232 4837 scope.go:117] "RemoveContainer" containerID="bb986b6c527ca78bf1e0896829a89d5b0ab27431c49d56719acca1f95eca36b5" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.729810 4837 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f484085-7b83-46a8-80c2-b3ef6f8b8798-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.729847 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppxfm\" (UniqueName: \"kubernetes.io/projected/6f484085-7b83-46a8-80c2-b3ef6f8b8798-kube-api-access-ppxfm\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.729861 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.729872 4837 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.729884 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz8t7\" (UniqueName: \"kubernetes.io/projected/8944c2be-da67-4cdd-9f75-0e473253e932-kube-api-access-cz8t7\") on node \"crc\" 
DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.729897 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.729909 4837 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8944c2be-da67-4cdd-9f75-0e473253e932-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.729921 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.729932 4837 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.729945 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f484085-7b83-46a8-80c2-b3ef6f8b8798-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.729956 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.736218 4837 scope.go:117] "RemoveContainer" containerID="4acde6c31477a18525c2bc313aa155955862f73aaa7708329d4edc29e752be5d" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.736468 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-combined-ca-bundle" (OuterVolumeSpecName: 
"combined-ca-bundle") pod "8944c2be-da67-4cdd-9f75-0e473253e932" (UID: "8944c2be-da67-4cdd-9f75-0e473253e932"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: E0313 12:07:54.736782 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4acde6c31477a18525c2bc313aa155955862f73aaa7708329d4edc29e752be5d\": container with ID starting with 4acde6c31477a18525c2bc313aa155955862f73aaa7708329d4edc29e752be5d not found: ID does not exist" containerID="4acde6c31477a18525c2bc313aa155955862f73aaa7708329d4edc29e752be5d" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.736816 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4acde6c31477a18525c2bc313aa155955862f73aaa7708329d4edc29e752be5d"} err="failed to get container status \"4acde6c31477a18525c2bc313aa155955862f73aaa7708329d4edc29e752be5d\": rpc error: code = NotFound desc = could not find container \"4acde6c31477a18525c2bc313aa155955862f73aaa7708329d4edc29e752be5d\": container with ID starting with 4acde6c31477a18525c2bc313aa155955862f73aaa7708329d4edc29e752be5d not found: ID does not exist" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.736841 4837 scope.go:117] "RemoveContainer" containerID="bb986b6c527ca78bf1e0896829a89d5b0ab27431c49d56719acca1f95eca36b5" Mar 13 12:07:54 crc kubenswrapper[4837]: E0313 12:07:54.737130 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb986b6c527ca78bf1e0896829a89d5b0ab27431c49d56719acca1f95eca36b5\": container with ID starting with bb986b6c527ca78bf1e0896829a89d5b0ab27431c49d56719acca1f95eca36b5 not found: ID does not exist" containerID="bb986b6c527ca78bf1e0896829a89d5b0ab27431c49d56719acca1f95eca36b5" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.737171 4837 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb986b6c527ca78bf1e0896829a89d5b0ab27431c49d56719acca1f95eca36b5"} err="failed to get container status \"bb986b6c527ca78bf1e0896829a89d5b0ab27431c49d56719acca1f95eca36b5\": rpc error: code = NotFound desc = could not find container \"bb986b6c527ca78bf1e0896829a89d5b0ab27431c49d56719acca1f95eca36b5\": container with ID starting with bb986b6c527ca78bf1e0896829a89d5b0ab27431c49d56719acca1f95eca36b5 not found: ID does not exist" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.737202 4837 scope.go:117] "RemoveContainer" containerID="317a9b585064c8564bcd2ae43ba40834f6b1cd25a3d83d32f956502b5b280276" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.749971 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-config-data" (OuterVolumeSpecName: "config-data") pod "8944c2be-da67-4cdd-9f75-0e473253e932" (UID: "8944c2be-da67-4cdd-9f75-0e473253e932"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.764172 4837 scope.go:117] "RemoveContainer" containerID="7d867af8873a7e0421fd52164c9458573cf6ac8847b38f845cf622f104ceb41b" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.792595 4837 scope.go:117] "RemoveContainer" containerID="343420b862af8b30fbf01c83c65d52d9d2faba010cdc819f2b823e8a9b058006" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.813592 4837 scope.go:117] "RemoveContainer" containerID="f106579d1eb92efbafe377b2c5e41ffb980fcd44573e4b8ba73109499680b552" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.830628 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.830679 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.999261 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.018600 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.043878 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.083968 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f484085-7b83-46a8-80c2-b3ef6f8b8798" path="/var/lib/kubelet/pods/6f484085-7b83-46a8-80c2-b3ef6f8b8798/volumes" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.095330 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:07:55 crc 
kubenswrapper[4837]: E0313 12:07:55.102682 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="ceilometer-central-agent" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.103186 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="ceilometer-central-agent" Mar 13 12:07:55 crc kubenswrapper[4837]: E0313 12:07:55.103369 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f484085-7b83-46a8-80c2-b3ef6f8b8798" containerName="cinder-api" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.103674 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f484085-7b83-46a8-80c2-b3ef6f8b8798" containerName="cinder-api" Mar 13 12:07:55 crc kubenswrapper[4837]: E0313 12:07:55.107081 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="proxy-httpd" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.107280 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="proxy-httpd" Mar 13 12:07:55 crc kubenswrapper[4837]: E0313 12:07:55.107399 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="sg-core" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.107471 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="sg-core" Mar 13 12:07:55 crc kubenswrapper[4837]: E0313 12:07:55.107581 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="ceilometer-notification-agent" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.107690 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="ceilometer-notification-agent" Mar 13 12:07:55 crc 
kubenswrapper[4837]: E0313 12:07:55.107797 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f484085-7b83-46a8-80c2-b3ef6f8b8798" containerName="cinder-api-log" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.107873 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f484085-7b83-46a8-80c2-b3ef6f8b8798" containerName="cinder-api-log" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.108378 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f484085-7b83-46a8-80c2-b3ef6f8b8798" containerName="cinder-api-log" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.108552 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="sg-core" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.108701 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f484085-7b83-46a8-80c2-b3ef6f8b8798" containerName="cinder-api" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.108801 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="ceilometer-notification-agent" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.111473 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="ceilometer-central-agent" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.111720 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="proxy-httpd" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.113155 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.118759 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.114985 4837 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.120010 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.122468 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.122574 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.122723 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.126683 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.129077 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.129330 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.133851 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.138806 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8004928-50bc-4db8-a701-4458c42bc776-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.138842 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8004928-50bc-4db8-a701-4458c42bc776-logs\") pod 
\"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.138879 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-config-data\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.138918 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rd84\" (UniqueName: \"kubernetes.io/projected/a8004928-50bc-4db8-a701-4458c42bc776-kube-api-access-8rd84\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.138935 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.138974 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-config-data-custom\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.138995 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec252a2a-f9a4-4894-991d-1a70f596519d-log-httpd\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 
crc kubenswrapper[4837]: I0313 12:07:55.139011 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-scripts\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.139063 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-scripts\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.139083 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-config-data\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.139109 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec252a2a-f9a4-4894-991d-1a70f596519d-run-httpd\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.139122 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.139142 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.139158 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.139177 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.139195 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jmwm\" (UniqueName: \"kubernetes.io/projected/ec252a2a-f9a4-4894-991d-1a70f596519d-kube-api-access-9jmwm\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241248 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jmwm\" (UniqueName: \"kubernetes.io/projected/ec252a2a-f9a4-4894-991d-1a70f596519d-kube-api-access-9jmwm\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241321 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8004928-50bc-4db8-a701-4458c42bc776-etc-machine-id\") pod 
\"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241362 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8004928-50bc-4db8-a701-4458c42bc776-logs\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241419 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-config-data\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241462 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rd84\" (UniqueName: \"kubernetes.io/projected/a8004928-50bc-4db8-a701-4458c42bc776-kube-api-access-8rd84\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241482 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241513 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-config-data-custom\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241539 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec252a2a-f9a4-4894-991d-1a70f596519d-log-httpd\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241559 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-scripts\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241610 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-scripts\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241628 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-config-data\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241694 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec252a2a-f9a4-4894-991d-1a70f596519d-run-httpd\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241712 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 
12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241750 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241776 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241810 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.242739 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8004928-50bc-4db8-a701-4458c42bc776-logs\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.242816 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8004928-50bc-4db8-a701-4458c42bc776-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.247589 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.248192 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-config-data\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.250040 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec252a2a-f9a4-4894-991d-1a70f596519d-run-httpd\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.250127 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-config-data-custom\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.250394 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec252a2a-f9a4-4894-991d-1a70f596519d-log-httpd\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.250938 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.251779 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-scripts\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.254179 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.254307 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-scripts\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.254402 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-config-data\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.260426 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.270244 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc 
kubenswrapper[4837]: I0313 12:07:55.271216 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jmwm\" (UniqueName: \"kubernetes.io/projected/ec252a2a-f9a4-4894-991d-1a70f596519d-kube-api-access-9jmwm\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.272174 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rd84\" (UniqueName: \"kubernetes.io/projected/a8004928-50bc-4db8-a701-4458c42bc776-kube-api-access-8rd84\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.457603 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.468620 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.711679 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-bfbc874dc-vsh7q" event={"ID":"36ffa543-526d-4d56-b599-06fcfe0988cf","Type":"ContainerStarted","Data":"fe5e27b6f150595d6723a12c077d23e29a6a968de2cd6f3c92d813816f07de44"} Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.712011 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.712180 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.722014 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.743319 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/swift-proxy-bfbc874dc-vsh7q" podStartSLOduration=8.743300661 podStartE2EDuration="8.743300661s" podCreationTimestamp="2026-03-13 12:07:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:55.731170641 +0000 UTC m=+1191.369437404" watchObservedRunningTime="2026-03-13 12:07:55.743300661 +0000 UTC m=+1191.381567424" Mar 13 12:07:55 crc kubenswrapper[4837]: W0313 12:07:55.994451 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8004928_50bc_4db8_a701_4458c42bc776.slice/crio-25518f190eebe12e29d032768537d154d0c34493c273de162b69131ff26850ea WatchSource:0}: Error finding container 25518f190eebe12e29d032768537d154d0c34493c273de162b69131ff26850ea: Status 404 returned error can't find the container with id 25518f190eebe12e29d032768537d154d0c34493c273de162b69131ff26850ea Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.995919 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.107223 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-78jtc"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.108810 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-78jtc" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.130617 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-78jtc"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.145918 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.156562 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5596f9dfb8-m9bxb" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.156843 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.185743 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzhsn\" (UniqueName: \"kubernetes.io/projected/ff8550d6-aacb-4848-928d-b1581a66d499-kube-api-access-lzhsn\") pod \"nova-api-db-create-78jtc\" (UID: \"ff8550d6-aacb-4848-928d-b1581a66d499\") " pod="openstack/nova-api-db-create-78jtc" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.185799 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff8550d6-aacb-4848-928d-b1581a66d499-operator-scripts\") pod \"nova-api-db-create-78jtc\" (UID: \"ff8550d6-aacb-4848-928d-b1581a66d499\") " pod="openstack/nova-api-db-create-78jtc" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.204674 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-mqgjq"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.206154 4837 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mqgjq" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.249462 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-8886-account-create-update-ljcrw"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.251075 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8886-account-create-update-ljcrw" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.253445 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.283077 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mqgjq"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.286885 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e51457d7-9619-4179-8f01-de6ffe5ceb82-operator-scripts\") pod \"nova-cell0-db-create-mqgjq\" (UID: \"e51457d7-9619-4179-8f01-de6ffe5ceb82\") " pod="openstack/nova-cell0-db-create-mqgjq" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.286942 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e397db42-b505-4447-87a2-4c12ed412f28-operator-scripts\") pod \"nova-api-8886-account-create-update-ljcrw\" (UID: \"e397db42-b505-4447-87a2-4c12ed412f28\") " pod="openstack/nova-api-8886-account-create-update-ljcrw" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.286971 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzhsn\" (UniqueName: \"kubernetes.io/projected/ff8550d6-aacb-4848-928d-b1581a66d499-kube-api-access-lzhsn\") pod \"nova-api-db-create-78jtc\" (UID: \"ff8550d6-aacb-4848-928d-b1581a66d499\") " 
pod="openstack/nova-api-db-create-78jtc" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.287002 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dmms\" (UniqueName: \"kubernetes.io/projected/e51457d7-9619-4179-8f01-de6ffe5ceb82-kube-api-access-9dmms\") pod \"nova-cell0-db-create-mqgjq\" (UID: \"e51457d7-9619-4179-8f01-de6ffe5ceb82\") " pod="openstack/nova-cell0-db-create-mqgjq" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.287025 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff8550d6-aacb-4848-928d-b1581a66d499-operator-scripts\") pod \"nova-api-db-create-78jtc\" (UID: \"ff8550d6-aacb-4848-928d-b1581a66d499\") " pod="openstack/nova-api-db-create-78jtc" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.287288 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8wqn\" (UniqueName: \"kubernetes.io/projected/e397db42-b505-4447-87a2-4c12ed412f28-kube-api-access-q8wqn\") pod \"nova-api-8886-account-create-update-ljcrw\" (UID: \"e397db42-b505-4447-87a2-4c12ed412f28\") " pod="openstack/nova-api-8886-account-create-update-ljcrw" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.287614 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff8550d6-aacb-4848-928d-b1581a66d499-operator-scripts\") pod \"nova-api-db-create-78jtc\" (UID: \"ff8550d6-aacb-4848-928d-b1581a66d499\") " pod="openstack/nova-api-db-create-78jtc" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.287737 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8886-account-create-update-ljcrw"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.312602 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lzhsn\" (UniqueName: \"kubernetes.io/projected/ff8550d6-aacb-4848-928d-b1581a66d499-kube-api-access-lzhsn\") pod \"nova-api-db-create-78jtc\" (UID: \"ff8550d6-aacb-4848-928d-b1581a66d499\") " pod="openstack/nova-api-db-create-78jtc" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.389324 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e51457d7-9619-4179-8f01-de6ffe5ceb82-operator-scripts\") pod \"nova-cell0-db-create-mqgjq\" (UID: \"e51457d7-9619-4179-8f01-de6ffe5ceb82\") " pod="openstack/nova-cell0-db-create-mqgjq" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.389380 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e397db42-b505-4447-87a2-4c12ed412f28-operator-scripts\") pod \"nova-api-8886-account-create-update-ljcrw\" (UID: \"e397db42-b505-4447-87a2-4c12ed412f28\") " pod="openstack/nova-api-8886-account-create-update-ljcrw" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.389421 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dmms\" (UniqueName: \"kubernetes.io/projected/e51457d7-9619-4179-8f01-de6ffe5ceb82-kube-api-access-9dmms\") pod \"nova-cell0-db-create-mqgjq\" (UID: \"e51457d7-9619-4179-8f01-de6ffe5ceb82\") " pod="openstack/nova-cell0-db-create-mqgjq" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.389462 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8wqn\" (UniqueName: \"kubernetes.io/projected/e397db42-b505-4447-87a2-4c12ed412f28-kube-api-access-q8wqn\") pod \"nova-api-8886-account-create-update-ljcrw\" (UID: \"e397db42-b505-4447-87a2-4c12ed412f28\") " pod="openstack/nova-api-8886-account-create-update-ljcrw" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.390957 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e51457d7-9619-4179-8f01-de6ffe5ceb82-operator-scripts\") pod \"nova-cell0-db-create-mqgjq\" (UID: \"e51457d7-9619-4179-8f01-de6ffe5ceb82\") " pod="openstack/nova-cell0-db-create-mqgjq" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.391286 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e397db42-b505-4447-87a2-4c12ed412f28-operator-scripts\") pod \"nova-api-8886-account-create-update-ljcrw\" (UID: \"e397db42-b505-4447-87a2-4c12ed412f28\") " pod="openstack/nova-api-8886-account-create-update-ljcrw" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.400125 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-t8qk9"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.401582 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-t8qk9" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.424556 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8wqn\" (UniqueName: \"kubernetes.io/projected/e397db42-b505-4447-87a2-4c12ed412f28-kube-api-access-q8wqn\") pod \"nova-api-8886-account-create-update-ljcrw\" (UID: \"e397db42-b505-4447-87a2-4c12ed412f28\") " pod="openstack/nova-api-8886-account-create-update-ljcrw" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.428981 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.431655 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dmms\" (UniqueName: \"kubernetes.io/projected/e51457d7-9619-4179-8f01-de6ffe5ceb82-kube-api-access-9dmms\") pod \"nova-cell0-db-create-mqgjq\" (UID: \"e51457d7-9619-4179-8f01-de6ffe5ceb82\") " 
pod="openstack/nova-cell0-db-create-mqgjq" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.453833 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-4581-account-create-update-w6tc2"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.455254 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4581-account-create-update-w6tc2" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.458776 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.460027 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-78jtc" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.467141 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-t8qk9"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.490851 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfjj7\" (UniqueName: \"kubernetes.io/projected/8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb-kube-api-access-tfjj7\") pod \"nova-cell1-db-create-t8qk9\" (UID: \"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb\") " pod="openstack/nova-cell1-db-create-t8qk9" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.491031 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec46ef58-a8e9-4354-b9a1-568535879964-operator-scripts\") pod \"nova-cell0-4581-account-create-update-w6tc2\" (UID: \"ec46ef58-a8e9-4354-b9a1-568535879964\") " pod="openstack/nova-cell0-4581-account-create-update-w6tc2" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.491105 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcs7r\" (UniqueName: 
\"kubernetes.io/projected/ec46ef58-a8e9-4354-b9a1-568535879964-kube-api-access-bcs7r\") pod \"nova-cell0-4581-account-create-update-w6tc2\" (UID: \"ec46ef58-a8e9-4354-b9a1-568535879964\") " pod="openstack/nova-cell0-4581-account-create-update-w6tc2" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.491125 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb-operator-scripts\") pod \"nova-cell1-db-create-t8qk9\" (UID: \"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb\") " pod="openstack/nova-cell1-db-create-t8qk9" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.507769 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4581-account-create-update-w6tc2"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.538864 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-67f9f46cf4-9cvcg"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.539221 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-67f9f46cf4-9cvcg" podUID="073acab9-3b9b-432a-aef7-b59bad9fa6ea" containerName="neutron-api" containerID="cri-o://592ae8d9d134287aaaf8e8bd131ed85ae4a0f882f18fe4b309c2608413d81458" gracePeriod=30 Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.539856 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-67f9f46cf4-9cvcg" podUID="073acab9-3b9b-432a-aef7-b59bad9fa6ea" containerName="neutron-httpd" containerID="cri-o://5abad0665ef76d6dfd0a8789ace2e34eb216059566daed67b5a9ba64a43b080f" gracePeriod=30 Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.565145 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-mqgjq" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.595191 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8886-account-create-update-ljcrw" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.595504 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec46ef58-a8e9-4354-b9a1-568535879964-operator-scripts\") pod \"nova-cell0-4581-account-create-update-w6tc2\" (UID: \"ec46ef58-a8e9-4354-b9a1-568535879964\") " pod="openstack/nova-cell0-4581-account-create-update-w6tc2" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.595593 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcs7r\" (UniqueName: \"kubernetes.io/projected/ec46ef58-a8e9-4354-b9a1-568535879964-kube-api-access-bcs7r\") pod \"nova-cell0-4581-account-create-update-w6tc2\" (UID: \"ec46ef58-a8e9-4354-b9a1-568535879964\") " pod="openstack/nova-cell0-4581-account-create-update-w6tc2" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.595619 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb-operator-scripts\") pod \"nova-cell1-db-create-t8qk9\" (UID: \"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb\") " pod="openstack/nova-cell1-db-create-t8qk9" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.595660 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfjj7\" (UniqueName: \"kubernetes.io/projected/8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb-kube-api-access-tfjj7\") pod \"nova-cell1-db-create-t8qk9\" (UID: \"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb\") " pod="openstack/nova-cell1-db-create-t8qk9" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.596513 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb-operator-scripts\") pod \"nova-cell1-db-create-t8qk9\" (UID: \"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb\") " pod="openstack/nova-cell1-db-create-t8qk9" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.596712 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec46ef58-a8e9-4354-b9a1-568535879964-operator-scripts\") pod \"nova-cell0-4581-account-create-update-w6tc2\" (UID: \"ec46ef58-a8e9-4354-b9a1-568535879964\") " pod="openstack/nova-cell0-4581-account-create-update-w6tc2" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.619386 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcs7r\" (UniqueName: \"kubernetes.io/projected/ec46ef58-a8e9-4354-b9a1-568535879964-kube-api-access-bcs7r\") pod \"nova-cell0-4581-account-create-update-w6tc2\" (UID: \"ec46ef58-a8e9-4354-b9a1-568535879964\") " pod="openstack/nova-cell0-4581-account-create-update-w6tc2" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.629286 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfjj7\" (UniqueName: \"kubernetes.io/projected/8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb-kube-api-access-tfjj7\") pod \"nova-cell1-db-create-t8qk9\" (UID: \"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb\") " pod="openstack/nova-cell1-db-create-t8qk9" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.648715 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-c124-account-create-update-8zqgg"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.650290 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c124-account-create-update-8zqgg" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.657978 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.697290 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c124-account-create-update-8zqgg"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.700363 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ac843c1-9934-4711-aae6-7f6920596cb3-operator-scripts\") pod \"nova-cell1-c124-account-create-update-8zqgg\" (UID: \"6ac843c1-9934-4711-aae6-7f6920596cb3\") " pod="openstack/nova-cell1-c124-account-create-update-8zqgg" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.700427 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzzgd\" (UniqueName: \"kubernetes.io/projected/6ac843c1-9934-4711-aae6-7f6920596cb3-kube-api-access-nzzgd\") pod \"nova-cell1-c124-account-create-update-8zqgg\" (UID: \"6ac843c1-9934-4711-aae6-7f6920596cb3\") " pod="openstack/nova-cell1-c124-account-create-update-8zqgg" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.785596 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec252a2a-f9a4-4894-991d-1a70f596519d","Type":"ContainerStarted","Data":"8963d958bbfe2f25190f6d4efa0bcd7a6fe7c107dfdb4e163c3ec794ab189d07"} Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.789292 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-t8qk9" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.789961 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4581-account-create-update-w6tc2" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.803713 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ac843c1-9934-4711-aae6-7f6920596cb3-operator-scripts\") pod \"nova-cell1-c124-account-create-update-8zqgg\" (UID: \"6ac843c1-9934-4711-aae6-7f6920596cb3\") " pod="openstack/nova-cell1-c124-account-create-update-8zqgg" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.803753 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzzgd\" (UniqueName: \"kubernetes.io/projected/6ac843c1-9934-4711-aae6-7f6920596cb3-kube-api-access-nzzgd\") pod \"nova-cell1-c124-account-create-update-8zqgg\" (UID: \"6ac843c1-9934-4711-aae6-7f6920596cb3\") " pod="openstack/nova-cell1-c124-account-create-update-8zqgg" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.804208 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a8004928-50bc-4db8-a701-4458c42bc776","Type":"ContainerStarted","Data":"25518f190eebe12e29d032768537d154d0c34493c273de162b69131ff26850ea"} Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.806977 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ac843c1-9934-4711-aae6-7f6920596cb3-operator-scripts\") pod \"nova-cell1-c124-account-create-update-8zqgg\" (UID: \"6ac843c1-9934-4711-aae6-7f6920596cb3\") " pod="openstack/nova-cell1-c124-account-create-update-8zqgg" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.823882 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzzgd\" (UniqueName: \"kubernetes.io/projected/6ac843c1-9934-4711-aae6-7f6920596cb3-kube-api-access-nzzgd\") pod \"nova-cell1-c124-account-create-update-8zqgg\" (UID: 
\"6ac843c1-9934-4711-aae6-7f6920596cb3\") " pod="openstack/nova-cell1-c124-account-create-update-8zqgg" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.848881 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.849114 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f0173ba9-535a-435d-bc51-75c069e69e46" containerName="glance-log" containerID="cri-o://5e147e84ce0affb8bbda5c741ef88617e4ee699f66923e1ff90efae96ad20482" gracePeriod=30 Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.849389 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f0173ba9-535a-435d-bc51-75c069e69e46" containerName="glance-httpd" containerID="cri-o://06e09154da451b1a2177b0ac750f567de80b4f12b1c2aa79102cdc2b77f671b6" gracePeriod=30 Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.008814 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c124-account-create-update-8zqgg" Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.182865 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" path="/var/lib/kubelet/pods/8944c2be-da67-4cdd-9f75-0e473253e932/volumes" Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.183564 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-78jtc"] Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.801682 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mqgjq"] Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.818498 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-t8qk9"] Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.842405 4837 generic.go:334] "Generic (PLEG): container finished" podID="073acab9-3b9b-432a-aef7-b59bad9fa6ea" containerID="5abad0665ef76d6dfd0a8789ace2e34eb216059566daed67b5a9ba64a43b080f" exitCode=0 Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.842528 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67f9f46cf4-9cvcg" event={"ID":"073acab9-3b9b-432a-aef7-b59bad9fa6ea","Type":"ContainerDied","Data":"5abad0665ef76d6dfd0a8789ace2e34eb216059566daed67b5a9ba64a43b080f"} Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.845946 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a8004928-50bc-4db8-a701-4458c42bc776","Type":"ContainerStarted","Data":"4e476b792fcc3524b8f2bd0219d572eb68bfad23422ead86a6897878642bf878"} Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.847896 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-78jtc" event={"ID":"ff8550d6-aacb-4848-928d-b1581a66d499","Type":"ContainerStarted","Data":"1c35974102ee9d500e8bf603751d70cec13d07eed47dcd00ec2798fe9d358807"} Mar 13 
12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.847930 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-78jtc" event={"ID":"ff8550d6-aacb-4848-928d-b1581a66d499","Type":"ContainerStarted","Data":"3a6b7d9266c6c68f69a431eb6b6c17756be54da729f94a82611fe08a1b1a72be"}
Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.894543 4837 generic.go:334] "Generic (PLEG): container finished" podID="f0173ba9-535a-435d-bc51-75c069e69e46" containerID="5e147e84ce0affb8bbda5c741ef88617e4ee699f66923e1ff90efae96ad20482" exitCode=143
Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.894807 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0173ba9-535a-435d-bc51-75c069e69e46","Type":"ContainerDied","Data":"5e147e84ce0affb8bbda5c741ef88617e4ee699f66923e1ff90efae96ad20482"}
Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.902385 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec252a2a-f9a4-4894-991d-1a70f596519d","Type":"ContainerStarted","Data":"f13323c6b5c7472c3b9f76328f6f7d80a5868b615ab9c24cb1496e6b292c2e9a"}
Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.974499 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-78jtc" podStartSLOduration=1.9744745510000001 podStartE2EDuration="1.974474551s" podCreationTimestamp="2026-03-13 12:07:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:57.88851587 +0000 UTC m=+1193.526782633" watchObservedRunningTime="2026-03-13 12:07:57.974474551 +0000 UTC m=+1193.612741314"
Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.979277 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8886-account-create-update-ljcrw"]
Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.004999 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4581-account-create-update-w6tc2"]
Mar 13 12:07:58 crc kubenswrapper[4837]: W0313 12:07:58.180998 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ac843c1_9934_4711_aae6_7f6920596cb3.slice/crio-bd1db75a0f6b89871b87bcc95f04a2edb02439e89ab9b5e1a096304f7490db99 WatchSource:0}: Error finding container bd1db75a0f6b89871b87bcc95f04a2edb02439e89ab9b5e1a096304f7490db99: Status 404 returned error can't find the container with id bd1db75a0f6b89871b87bcc95f04a2edb02439e89ab9b5e1a096304f7490db99
Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.193820 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c124-account-create-update-8zqgg"]
Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.912331 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t8qk9" event={"ID":"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb","Type":"ContainerStarted","Data":"5e6f4da7142b59c465f13069e8abffd32ebc3f04eeb6b88f772977ed584113c2"}
Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.912651 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t8qk9" event={"ID":"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb","Type":"ContainerStarted","Data":"69cde6bb4086abefd3b421a4e0db78878c121c0be2c2674284c1675581361bdc"}
Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.914124 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4581-account-create-update-w6tc2" event={"ID":"ec46ef58-a8e9-4354-b9a1-568535879964","Type":"ContainerStarted","Data":"c2cc081c6cf65b0ab460d8cc6143c9f0d5447d7db94e85de44cfe2121792b6a0"}
Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.914172 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4581-account-create-update-w6tc2" event={"ID":"ec46ef58-a8e9-4354-b9a1-568535879964","Type":"ContainerStarted","Data":"c19e4f72f1c5f35690c3fb2bd2be44a6fa31ffdb302bbef941997a819e69b808"}
Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.915499 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c124-account-create-update-8zqgg" event={"ID":"6ac843c1-9934-4711-aae6-7f6920596cb3","Type":"ContainerStarted","Data":"5d76ffad79a0d1339467174946f42bf027114aea75c47bb037057ca882b93f88"}
Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.915529 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c124-account-create-update-8zqgg" event={"ID":"6ac843c1-9934-4711-aae6-7f6920596cb3","Type":"ContainerStarted","Data":"bd1db75a0f6b89871b87bcc95f04a2edb02439e89ab9b5e1a096304f7490db99"}
Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.916961 4837 generic.go:334] "Generic (PLEG): container finished" podID="e51457d7-9619-4179-8f01-de6ffe5ceb82" containerID="76d8bcdb73b13d595e4c37de91e0da9193b0dfe32e04f54fbcbfc723d4f95d1f" exitCode=0
Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.917074 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mqgjq" event={"ID":"e51457d7-9619-4179-8f01-de6ffe5ceb82","Type":"ContainerDied","Data":"76d8bcdb73b13d595e4c37de91e0da9193b0dfe32e04f54fbcbfc723d4f95d1f"}
Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.917091 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mqgjq" event={"ID":"e51457d7-9619-4179-8f01-de6ffe5ceb82","Type":"ContainerStarted","Data":"1ff4b0ecebe26fef949db006c82847907f81761a168e24a50335110289080520"}
Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.919122 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec252a2a-f9a4-4894-991d-1a70f596519d","Type":"ContainerStarted","Data":"759b2c4c55f496a02021bffec50cb8a1d6cfb6037ffefb8f05fc410f86a3f8d4"}
Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.920845 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a8004928-50bc-4db8-a701-4458c42bc776","Type":"ContainerStarted","Data":"132bbc5ebed761d4d1fc57b12552302464956cd861bf757a4ff02da13f8f52f6"}
Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.920991 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.922309 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8886-account-create-update-ljcrw" event={"ID":"e397db42-b505-4447-87a2-4c12ed412f28","Type":"ContainerStarted","Data":"a98015db97ff0f5b37e30b833d1fc53c9a24f182fbe7bafcf011e2544e8dd80d"}
Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.922341 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8886-account-create-update-ljcrw" event={"ID":"e397db42-b505-4447-87a2-4c12ed412f28","Type":"ContainerStarted","Data":"e4a50d17ef0d5b10ca2c0d2aeafd143cd9c5e63e31ce86c11aca1ecba4422049"}
Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.923685 4837 generic.go:334] "Generic (PLEG): container finished" podID="ff8550d6-aacb-4848-928d-b1581a66d499" containerID="1c35974102ee9d500e8bf603751d70cec13d07eed47dcd00ec2798fe9d358807" exitCode=0
Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.923718 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-78jtc" event={"ID":"ff8550d6-aacb-4848-928d-b1581a66d499","Type":"ContainerDied","Data":"1c35974102ee9d500e8bf603751d70cec13d07eed47dcd00ec2798fe9d358807"}
Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.935545 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-t8qk9" podStartSLOduration=2.93552743 podStartE2EDuration="2.93552743s" podCreationTimestamp="2026-03-13 12:07:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:58.925823125 +0000 UTC m=+1194.564089908" watchObservedRunningTime="2026-03-13 12:07:58.93552743 +0000 UTC m=+1194.573794193"
Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.964091 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-8886-account-create-update-ljcrw" podStartSLOduration=2.964075428 podStartE2EDuration="2.964075428s" podCreationTimestamp="2026-03-13 12:07:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:58.955098345 +0000 UTC m=+1194.593365108" watchObservedRunningTime="2026-03-13 12:07:58.964075428 +0000 UTC m=+1194.602342191"
Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.970830 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-4581-account-create-update-w6tc2" podStartSLOduration=2.970813599 podStartE2EDuration="2.970813599s" podCreationTimestamp="2026-03-13 12:07:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:58.9683083 +0000 UTC m=+1194.606575063" watchObservedRunningTime="2026-03-13 12:07:58.970813599 +0000 UTC m=+1194.609080362"
Mar 13 12:07:59 crc kubenswrapper[4837]: I0313 12:07:59.019108 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-c124-account-create-update-8zqgg" podStartSLOduration=3.019083696 podStartE2EDuration="3.019083696s" podCreationTimestamp="2026-03-13 12:07:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:59.010150455 +0000 UTC m=+1194.648417238" watchObservedRunningTime="2026-03-13 12:07:59.019083696 +0000 UTC m=+1194.657350459"
Mar 13 12:07:59 crc kubenswrapper[4837]: I0313 12:07:59.044894 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.044877446 podStartE2EDuration="4.044877446s" podCreationTimestamp="2026-03-13 12:07:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:59.036762052 +0000 UTC m=+1194.675028815" watchObservedRunningTime="2026-03-13 12:07:59.044877446 +0000 UTC m=+1194.683144199"
Mar 13 12:07:59 crc kubenswrapper[4837]: I0313 12:07:59.405664 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="6f484085-7b83-46a8-80c2-b3ef6f8b8798" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.169:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 12:07:59 crc kubenswrapper[4837]: I0313 12:07:59.942130 4837 generic.go:334] "Generic (PLEG): container finished" podID="8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb" containerID="5e6f4da7142b59c465f13069e8abffd32ebc3f04eeb6b88f772977ed584113c2" exitCode=0
Mar 13 12:07:59 crc kubenswrapper[4837]: I0313 12:07:59.942200 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t8qk9" event={"ID":"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb","Type":"ContainerDied","Data":"5e6f4da7142b59c465f13069e8abffd32ebc3f04eeb6b88f772977ed584113c2"}
Mar 13 12:07:59 crc kubenswrapper[4837]: I0313 12:07:59.944777 4837 generic.go:334] "Generic (PLEG): container finished" podID="6ac843c1-9934-4711-aae6-7f6920596cb3" containerID="5d76ffad79a0d1339467174946f42bf027114aea75c47bb037057ca882b93f88" exitCode=0
Mar 13 12:07:59 crc kubenswrapper[4837]: I0313 12:07:59.945039 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c124-account-create-update-8zqgg" event={"ID":"6ac843c1-9934-4711-aae6-7f6920596cb3","Type":"ContainerDied","Data":"5d76ffad79a0d1339467174946f42bf027114aea75c47bb037057ca882b93f88"}
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.145284 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556728-7n29h"]
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.147207 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556728-7n29h"
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.149158 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj"
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.149444 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.149624 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.155583 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556728-7n29h"]
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.212383 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2d5k\" (UniqueName: \"kubernetes.io/projected/47ae408b-faad-4a52-ad09-428242645381-kube-api-access-l2d5k\") pod \"auto-csr-approver-29556728-7n29h\" (UID: \"47ae408b-faad-4a52-ad09-428242645381\") " pod="openshift-infra/auto-csr-approver-29556728-7n29h"
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.313574 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2d5k\" (UniqueName: \"kubernetes.io/projected/47ae408b-faad-4a52-ad09-428242645381-kube-api-access-l2d5k\") pod \"auto-csr-approver-29556728-7n29h\" (UID: \"47ae408b-faad-4a52-ad09-428242645381\") " pod="openshift-infra/auto-csr-approver-29556728-7n29h"
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.334410 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2d5k\" (UniqueName: \"kubernetes.io/projected/47ae408b-faad-4a52-ad09-428242645381-kube-api-access-l2d5k\") pod \"auto-csr-approver-29556728-7n29h\" (UID: \"47ae408b-faad-4a52-ad09-428242645381\") " pod="openshift-infra/auto-csr-approver-29556728-7n29h"
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.432088 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-78jtc"
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.524326 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mqgjq"
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.541043 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556728-7n29h"
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.623249 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e51457d7-9619-4179-8f01-de6ffe5ceb82-operator-scripts\") pod \"e51457d7-9619-4179-8f01-de6ffe5ceb82\" (UID: \"e51457d7-9619-4179-8f01-de6ffe5ceb82\") "
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.623342 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff8550d6-aacb-4848-928d-b1581a66d499-operator-scripts\") pod \"ff8550d6-aacb-4848-928d-b1581a66d499\" (UID: \"ff8550d6-aacb-4848-928d-b1581a66d499\") "
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.623396 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzhsn\" (UniqueName: \"kubernetes.io/projected/ff8550d6-aacb-4848-928d-b1581a66d499-kube-api-access-lzhsn\") pod \"ff8550d6-aacb-4848-928d-b1581a66d499\" (UID: \"ff8550d6-aacb-4848-928d-b1581a66d499\") "
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.623423 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dmms\" (UniqueName: \"kubernetes.io/projected/e51457d7-9619-4179-8f01-de6ffe5ceb82-kube-api-access-9dmms\") pod \"e51457d7-9619-4179-8f01-de6ffe5ceb82\" (UID: \"e51457d7-9619-4179-8f01-de6ffe5ceb82\") "
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.625605 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e51457d7-9619-4179-8f01-de6ffe5ceb82-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e51457d7-9619-4179-8f01-de6ffe5ceb82" (UID: "e51457d7-9619-4179-8f01-de6ffe5ceb82"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.627397 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff8550d6-aacb-4848-928d-b1581a66d499-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ff8550d6-aacb-4848-928d-b1581a66d499" (UID: "ff8550d6-aacb-4848-928d-b1581a66d499"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.636038 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff8550d6-aacb-4848-928d-b1581a66d499-kube-api-access-lzhsn" (OuterVolumeSpecName: "kube-api-access-lzhsn") pod "ff8550d6-aacb-4848-928d-b1581a66d499" (UID: "ff8550d6-aacb-4848-928d-b1581a66d499"). InnerVolumeSpecName "kube-api-access-lzhsn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.636112 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e51457d7-9619-4179-8f01-de6ffe5ceb82-kube-api-access-9dmms" (OuterVolumeSpecName: "kube-api-access-9dmms") pod "e51457d7-9619-4179-8f01-de6ffe5ceb82" (UID: "e51457d7-9619-4179-8f01-de6ffe5ceb82"). InnerVolumeSpecName "kube-api-access-9dmms". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.726035 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e51457d7-9619-4179-8f01-de6ffe5ceb82-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.726319 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff8550d6-aacb-4848-928d-b1581a66d499-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.726332 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzhsn\" (UniqueName: \"kubernetes.io/projected/ff8550d6-aacb-4848-928d-b1581a66d499-kube-api-access-lzhsn\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.726343 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dmms\" (UniqueName: \"kubernetes.io/projected/e51457d7-9619-4179-8f01-de6ffe5ceb82-kube-api-access-9dmms\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.762081 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.827373 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0173ba9-535a-435d-bc51-75c069e69e46-logs\") pod \"f0173ba9-535a-435d-bc51-75c069e69e46\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") "
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.827473 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-combined-ca-bundle\") pod \"f0173ba9-535a-435d-bc51-75c069e69e46\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") "
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.827496 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"f0173ba9-535a-435d-bc51-75c069e69e46\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") "
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.827522 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-config-data\") pod \"f0173ba9-535a-435d-bc51-75c069e69e46\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") "
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.827549 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-public-tls-certs\") pod \"f0173ba9-535a-435d-bc51-75c069e69e46\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") "
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.827571 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-scripts\") pod \"f0173ba9-535a-435d-bc51-75c069e69e46\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") "
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.827615 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0173ba9-535a-435d-bc51-75c069e69e46-httpd-run\") pod \"f0173ba9-535a-435d-bc51-75c069e69e46\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") "
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.827669 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvzd7\" (UniqueName: \"kubernetes.io/projected/f0173ba9-535a-435d-bc51-75c069e69e46-kube-api-access-gvzd7\") pod \"f0173ba9-535a-435d-bc51-75c069e69e46\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") "
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.828492 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0173ba9-535a-435d-bc51-75c069e69e46-logs" (OuterVolumeSpecName: "logs") pod "f0173ba9-535a-435d-bc51-75c069e69e46" (UID: "f0173ba9-535a-435d-bc51-75c069e69e46"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.830822 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0173ba9-535a-435d-bc51-75c069e69e46-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f0173ba9-535a-435d-bc51-75c069e69e46" (UID: "f0173ba9-535a-435d-bc51-75c069e69e46"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.835613 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0173ba9-535a-435d-bc51-75c069e69e46-kube-api-access-gvzd7" (OuterVolumeSpecName: "kube-api-access-gvzd7") pod "f0173ba9-535a-435d-bc51-75c069e69e46" (UID: "f0173ba9-535a-435d-bc51-75c069e69e46"). InnerVolumeSpecName "kube-api-access-gvzd7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.839104 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-scripts" (OuterVolumeSpecName: "scripts") pod "f0173ba9-535a-435d-bc51-75c069e69e46" (UID: "f0173ba9-535a-435d-bc51-75c069e69e46"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.843125 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "f0173ba9-535a-435d-bc51-75c069e69e46" (UID: "f0173ba9-535a-435d-bc51-75c069e69e46"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.886956 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0173ba9-535a-435d-bc51-75c069e69e46" (UID: "f0173ba9-535a-435d-bc51-75c069e69e46"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.907551 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-config-data" (OuterVolumeSpecName: "config-data") pod "f0173ba9-535a-435d-bc51-75c069e69e46" (UID: "f0173ba9-535a-435d-bc51-75c069e69e46"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.929283 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.929331 4837 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0173ba9-535a-435d-bc51-75c069e69e46-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.929347 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvzd7\" (UniqueName: \"kubernetes.io/projected/f0173ba9-535a-435d-bc51-75c069e69e46-kube-api-access-gvzd7\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.929358 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0173ba9-535a-435d-bc51-75c069e69e46-logs\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.929367 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.929387 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.929398 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.929734 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f0173ba9-535a-435d-bc51-75c069e69e46" (UID: "f0173ba9-535a-435d-bc51-75c069e69e46"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.958894 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.966746 4837 generic.go:334] "Generic (PLEG): container finished" podID="e397db42-b505-4447-87a2-4c12ed412f28" containerID="a98015db97ff0f5b37e30b833d1fc53c9a24f182fbe7bafcf011e2544e8dd80d" exitCode=0
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.966844 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8886-account-create-update-ljcrw" event={"ID":"e397db42-b505-4447-87a2-4c12ed412f28","Type":"ContainerDied","Data":"a98015db97ff0f5b37e30b833d1fc53c9a24f182fbe7bafcf011e2544e8dd80d"}
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.972631 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-78jtc" event={"ID":"ff8550d6-aacb-4848-928d-b1581a66d499","Type":"ContainerDied","Data":"3a6b7d9266c6c68f69a431eb6b6c17756be54da729f94a82611fe08a1b1a72be"}
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.972851 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a6b7d9266c6c68f69a431eb6b6c17756be54da729f94a82611fe08a1b1a72be"
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.972985 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-78jtc"
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.975305 4837 generic.go:334] "Generic (PLEG): container finished" podID="ec46ef58-a8e9-4354-b9a1-568535879964" containerID="c2cc081c6cf65b0ab460d8cc6143c9f0d5447d7db94e85de44cfe2121792b6a0" exitCode=0
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.975375 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4581-account-create-update-w6tc2" event={"ID":"ec46ef58-a8e9-4354-b9a1-568535879964","Type":"ContainerDied","Data":"c2cc081c6cf65b0ab460d8cc6143c9f0d5447d7db94e85de44cfe2121792b6a0"}
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.978608 4837 generic.go:334] "Generic (PLEG): container finished" podID="f0173ba9-535a-435d-bc51-75c069e69e46" containerID="06e09154da451b1a2177b0ac750f567de80b4f12b1c2aa79102cdc2b77f671b6" exitCode=0
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.978741 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.978981 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0173ba9-535a-435d-bc51-75c069e69e46","Type":"ContainerDied","Data":"06e09154da451b1a2177b0ac750f567de80b4f12b1c2aa79102cdc2b77f671b6"}
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.979043 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0173ba9-535a-435d-bc51-75c069e69e46","Type":"ContainerDied","Data":"a0b7b975f0a853ab5afecbe29fde34fc4210243637b54de91d96895e00f81e30"}
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.979065 4837 scope.go:117] "RemoveContainer" containerID="06e09154da451b1a2177b0ac750f567de80b4f12b1c2aa79102cdc2b77f671b6"
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.987005 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mqgjq" event={"ID":"e51457d7-9619-4179-8f01-de6ffe5ceb82","Type":"ContainerDied","Data":"1ff4b0ecebe26fef949db006c82847907f81761a168e24a50335110289080520"}
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.987041 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ff4b0ecebe26fef949db006c82847907f81761a168e24a50335110289080520"
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.987110 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mqgjq"
Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.998518 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec252a2a-f9a4-4894-991d-1a70f596519d","Type":"ContainerStarted","Data":"2fa9cafe6c8b9f2bb8d388b54db3598398838d667dccefef13c7e8655cabb201"}
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.025865 4837 scope.go:117] "RemoveContainer" containerID="5e147e84ce0affb8bbda5c741ef88617e4ee699f66923e1ff90efae96ad20482"
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.032259 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.032303 4837 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.101493 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.115792 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.133411 4837 scope.go:117] "RemoveContainer" containerID="06e09154da451b1a2177b0ac750f567de80b4f12b1c2aa79102cdc2b77f671b6"
Mar 13 12:08:01 crc kubenswrapper[4837]: E0313 12:08:01.138782 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06e09154da451b1a2177b0ac750f567de80b4f12b1c2aa79102cdc2b77f671b6\": container with ID starting with 06e09154da451b1a2177b0ac750f567de80b4f12b1c2aa79102cdc2b77f671b6 not found: ID does not exist" containerID="06e09154da451b1a2177b0ac750f567de80b4f12b1c2aa79102cdc2b77f671b6"
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.138823 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06e09154da451b1a2177b0ac750f567de80b4f12b1c2aa79102cdc2b77f671b6"} err="failed to get container status \"06e09154da451b1a2177b0ac750f567de80b4f12b1c2aa79102cdc2b77f671b6\": rpc error: code = NotFound desc = could not find container \"06e09154da451b1a2177b0ac750f567de80b4f12b1c2aa79102cdc2b77f671b6\": container with ID starting with 06e09154da451b1a2177b0ac750f567de80b4f12b1c2aa79102cdc2b77f671b6 not found: ID does not exist"
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.138846 4837 scope.go:117] "RemoveContainer" containerID="5e147e84ce0affb8bbda5c741ef88617e4ee699f66923e1ff90efae96ad20482"
Mar 13 12:08:01 crc kubenswrapper[4837]: E0313 12:08:01.141866 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e147e84ce0affb8bbda5c741ef88617e4ee699f66923e1ff90efae96ad20482\": container with ID starting with 5e147e84ce0affb8bbda5c741ef88617e4ee699f66923e1ff90efae96ad20482 not found: ID does not exist" containerID="5e147e84ce0affb8bbda5c741ef88617e4ee699f66923e1ff90efae96ad20482"
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.141917 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e147e84ce0affb8bbda5c741ef88617e4ee699f66923e1ff90efae96ad20482"} err="failed to get container status \"5e147e84ce0affb8bbda5c741ef88617e4ee699f66923e1ff90efae96ad20482\": rpc error: code = NotFound desc = could not find container \"5e147e84ce0affb8bbda5c741ef88617e4ee699f66923e1ff90efae96ad20482\": container with ID starting with 5e147e84ce0affb8bbda5c741ef88617e4ee699f66923e1ff90efae96ad20482 not found: ID does not exist"
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.146371 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556728-7n29h"]
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.156179 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 12:08:01 crc kubenswrapper[4837]: E0313 12:08:01.156606 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0173ba9-535a-435d-bc51-75c069e69e46" containerName="glance-log"
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.156622 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0173ba9-535a-435d-bc51-75c069e69e46" containerName="glance-log"
Mar 13 12:08:01 crc kubenswrapper[4837]: E0313 12:08:01.156660 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e51457d7-9619-4179-8f01-de6ffe5ceb82" containerName="mariadb-database-create"
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.156669 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e51457d7-9619-4179-8f01-de6ffe5ceb82" containerName="mariadb-database-create"
Mar 13 12:08:01 crc kubenswrapper[4837]: E0313 12:08:01.156683 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0173ba9-535a-435d-bc51-75c069e69e46" containerName="glance-httpd"
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.156689 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0173ba9-535a-435d-bc51-75c069e69e46" containerName="glance-httpd"
Mar 13 12:08:01 crc kubenswrapper[4837]: E0313 12:08:01.156707 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff8550d6-aacb-4848-928d-b1581a66d499" containerName="mariadb-database-create"
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.156712 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8550d6-aacb-4848-928d-b1581a66d499" containerName="mariadb-database-create"
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.156877 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0173ba9-535a-435d-bc51-75c069e69e46" containerName="glance-log"
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.156894 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e51457d7-9619-4179-8f01-de6ffe5ceb82" containerName="mariadb-database-create"
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.156904 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff8550d6-aacb-4848-928d-b1581a66d499" containerName="mariadb-database-create"
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.156917 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0173ba9-535a-435d-bc51-75c069e69e46" containerName="glance-httpd"
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.157858 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.160210 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.160784 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.174028 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.350593 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3f87d89-35d5-4dc0-9c37-5297718a9351-scripts\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.350829 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3f87d89-35d5-4dc0-9c37-5297718a9351-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.350894 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.350918 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8hs6\" (UniqueName: \"kubernetes.io/projected/d3f87d89-35d5-4dc0-9c37-5297718a9351-kube-api-access-d8hs6\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.351230 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3f87d89-35d5-4dc0-9c37-5297718a9351-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.351268 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3f87d89-35d5-4dc0-9c37-5297718a9351-logs\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0"
Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.351301 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\"
(UniqueName: \"kubernetes.io/secret/d3f87d89-35d5-4dc0-9c37-5297718a9351-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.351336 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3f87d89-35d5-4dc0-9c37-5297718a9351-config-data\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.453448 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.453502 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8hs6\" (UniqueName: \"kubernetes.io/projected/d3f87d89-35d5-4dc0-9c37-5297718a9351-kube-api-access-d8hs6\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.453545 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3f87d89-35d5-4dc0-9c37-5297718a9351-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.453576 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d3f87d89-35d5-4dc0-9c37-5297718a9351-logs\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.453616 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f87d89-35d5-4dc0-9c37-5297718a9351-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.453682 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3f87d89-35d5-4dc0-9c37-5297718a9351-config-data\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.453708 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3f87d89-35d5-4dc0-9c37-5297718a9351-scripts\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.453767 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3f87d89-35d5-4dc0-9c37-5297718a9351-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.464195 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3f87d89-35d5-4dc0-9c37-5297718a9351-logs\") pod 
\"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.464500 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.468901 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f87d89-35d5-4dc0-9c37-5297718a9351-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.474344 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3f87d89-35d5-4dc0-9c37-5297718a9351-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.474597 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3f87d89-35d5-4dc0-9c37-5297718a9351-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.483496 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3f87d89-35d5-4dc0-9c37-5297718a9351-scripts\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") 
" pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.485175 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3f87d89-35d5-4dc0-9c37-5297718a9351-config-data\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.530749 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8hs6\" (UniqueName: \"kubernetes.io/projected/d3f87d89-35d5-4dc0-9c37-5297718a9351-kube-api-access-d8hs6\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.541456 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.597311 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c124-account-create-update-8zqgg" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.609316 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-t8qk9" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.762591 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ac843c1-9934-4711-aae6-7f6920596cb3-operator-scripts\") pod \"6ac843c1-9934-4711-aae6-7f6920596cb3\" (UID: \"6ac843c1-9934-4711-aae6-7f6920596cb3\") " Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.762744 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzzgd\" (UniqueName: \"kubernetes.io/projected/6ac843c1-9934-4711-aae6-7f6920596cb3-kube-api-access-nzzgd\") pod \"6ac843c1-9934-4711-aae6-7f6920596cb3\" (UID: \"6ac843c1-9934-4711-aae6-7f6920596cb3\") " Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.762808 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfjj7\" (UniqueName: \"kubernetes.io/projected/8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb-kube-api-access-tfjj7\") pod \"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb\" (UID: \"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb\") " Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.763169 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb-operator-scripts\") pod \"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb\" (UID: \"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb\") " Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.764207 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ac843c1-9934-4711-aae6-7f6920596cb3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6ac843c1-9934-4711-aae6-7f6920596cb3" (UID: "6ac843c1-9934-4711-aae6-7f6920596cb3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.764998 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb" (UID: "8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.765071 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ac843c1-9934-4711-aae6-7f6920596cb3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.767666 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ac843c1-9934-4711-aae6-7f6920596cb3-kube-api-access-nzzgd" (OuterVolumeSpecName: "kube-api-access-nzzgd") pod "6ac843c1-9934-4711-aae6-7f6920596cb3" (UID: "6ac843c1-9934-4711-aae6-7f6920596cb3"). InnerVolumeSpecName "kube-api-access-nzzgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.770405 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb-kube-api-access-tfjj7" (OuterVolumeSpecName: "kube-api-access-tfjj7") pod "8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb" (UID: "8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb"). InnerVolumeSpecName "kube-api-access-tfjj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.794360 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.866733 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzzgd\" (UniqueName: \"kubernetes.io/projected/6ac843c1-9934-4711-aae6-7f6920596cb3-kube-api-access-nzzgd\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.866779 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfjj7\" (UniqueName: \"kubernetes.io/projected/8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb-kube-api-access-tfjj7\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.866793 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.888741 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.978784 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-httpd-config\") pod \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.978928 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s56d9\" (UniqueName: \"kubernetes.io/projected/073acab9-3b9b-432a-aef7-b59bad9fa6ea-kube-api-access-s56d9\") pod \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.978987 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-combined-ca-bundle\") pod \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.979060 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-ovndb-tls-certs\") pod \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.979171 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-config\") pod \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.041794 4837 generic.go:334] "Generic (PLEG): container finished" podID="073acab9-3b9b-432a-aef7-b59bad9fa6ea" 
containerID="592ae8d9d134287aaaf8e8bd131ed85ae4a0f882f18fe4b309c2608413d81458" exitCode=0 Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.042216 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67f9f46cf4-9cvcg" event={"ID":"073acab9-3b9b-432a-aef7-b59bad9fa6ea","Type":"ContainerDied","Data":"592ae8d9d134287aaaf8e8bd131ed85ae4a0f882f18fe4b309c2608413d81458"} Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.042246 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67f9f46cf4-9cvcg" event={"ID":"073acab9-3b9b-432a-aef7-b59bad9fa6ea","Type":"ContainerDied","Data":"f8cb990fe37777f793f0250c14cdfa0b903194e81a71cae32c4012805c32b7c7"} Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.042273 4837 scope.go:117] "RemoveContainer" containerID="5abad0665ef76d6dfd0a8789ace2e34eb216059566daed67b5a9ba64a43b080f" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.042460 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.051958 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-t8qk9" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.053911 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "073acab9-3b9b-432a-aef7-b59bad9fa6ea" (UID: "073acab9-3b9b-432a-aef7-b59bad9fa6ea"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.054304 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/073acab9-3b9b-432a-aef7-b59bad9fa6ea-kube-api-access-s56d9" (OuterVolumeSpecName: "kube-api-access-s56d9") pod "073acab9-3b9b-432a-aef7-b59bad9fa6ea" (UID: "073acab9-3b9b-432a-aef7-b59bad9fa6ea"). InnerVolumeSpecName "kube-api-access-s56d9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.062206 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t8qk9" event={"ID":"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb","Type":"ContainerDied","Data":"69cde6bb4086abefd3b421a4e0db78878c121c0be2c2674284c1675581361bdc"} Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.073815 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69cde6bb4086abefd3b421a4e0db78878c121c0be2c2674284c1675581361bdc" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.073867 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556728-7n29h" event={"ID":"47ae408b-faad-4a52-ad09-428242645381","Type":"ContainerStarted","Data":"f3da93ab4a472ba7116a0beb08f63b2c302111f8cbb9bf5768b3b8124101f12f"} Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.079027 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c124-account-create-update-8zqgg" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.079153 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c124-account-create-update-8zqgg" event={"ID":"6ac843c1-9934-4711-aae6-7f6920596cb3","Type":"ContainerDied","Data":"bd1db75a0f6b89871b87bcc95f04a2edb02439e89ab9b5e1a096304f7490db99"} Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.079182 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd1db75a0f6b89871b87bcc95f04a2edb02439e89ab9b5e1a096304f7490db99" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.083737 4837 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.083859 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s56d9\" (UniqueName: \"kubernetes.io/projected/073acab9-3b9b-432a-aef7-b59bad9fa6ea-kube-api-access-s56d9\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.134656 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "073acab9-3b9b-432a-aef7-b59bad9fa6ea" (UID: "073acab9-3b9b-432a-aef7-b59bad9fa6ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.140710 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-config" (OuterVolumeSpecName: "config") pod "073acab9-3b9b-432a-aef7-b59bad9fa6ea" (UID: "073acab9-3b9b-432a-aef7-b59bad9fa6ea"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.154923 4837 scope.go:117] "RemoveContainer" containerID="592ae8d9d134287aaaf8e8bd131ed85ae4a0f882f18fe4b309c2608413d81458" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.185371 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "073acab9-3b9b-432a-aef7-b59bad9fa6ea" (UID: "073acab9-3b9b-432a-aef7-b59bad9fa6ea"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.208029 4837 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.208189 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.208302 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.232122 4837 scope.go:117] "RemoveContainer" containerID="5abad0665ef76d6dfd0a8789ace2e34eb216059566daed67b5a9ba64a43b080f" Mar 13 12:08:02 crc kubenswrapper[4837]: E0313 12:08:02.239110 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5abad0665ef76d6dfd0a8789ace2e34eb216059566daed67b5a9ba64a43b080f\": container with ID starting with 5abad0665ef76d6dfd0a8789ace2e34eb216059566daed67b5a9ba64a43b080f not 
found: ID does not exist" containerID="5abad0665ef76d6dfd0a8789ace2e34eb216059566daed67b5a9ba64a43b080f" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.239155 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5abad0665ef76d6dfd0a8789ace2e34eb216059566daed67b5a9ba64a43b080f"} err="failed to get container status \"5abad0665ef76d6dfd0a8789ace2e34eb216059566daed67b5a9ba64a43b080f\": rpc error: code = NotFound desc = could not find container \"5abad0665ef76d6dfd0a8789ace2e34eb216059566daed67b5a9ba64a43b080f\": container with ID starting with 5abad0665ef76d6dfd0a8789ace2e34eb216059566daed67b5a9ba64a43b080f not found: ID does not exist" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.239183 4837 scope.go:117] "RemoveContainer" containerID="592ae8d9d134287aaaf8e8bd131ed85ae4a0f882f18fe4b309c2608413d81458" Mar 13 12:08:02 crc kubenswrapper[4837]: E0313 12:08:02.242210 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"592ae8d9d134287aaaf8e8bd131ed85ae4a0f882f18fe4b309c2608413d81458\": container with ID starting with 592ae8d9d134287aaaf8e8bd131ed85ae4a0f882f18fe4b309c2608413d81458 not found: ID does not exist" containerID="592ae8d9d134287aaaf8e8bd131ed85ae4a0f882f18fe4b309c2608413d81458" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.242337 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"592ae8d9d134287aaaf8e8bd131ed85ae4a0f882f18fe4b309c2608413d81458"} err="failed to get container status \"592ae8d9d134287aaaf8e8bd131ed85ae4a0f882f18fe4b309c2608413d81458\": rpc error: code = NotFound desc = could not find container \"592ae8d9d134287aaaf8e8bd131ed85ae4a0f882f18fe4b309c2608413d81458\": container with ID starting with 592ae8d9d134287aaaf8e8bd131ed85ae4a0f882f18fe4b309c2608413d81458 not found: ID does not exist" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.502093 
4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-67f9f46cf4-9cvcg"] Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.539449 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-67f9f46cf4-9cvcg"] Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.598224 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:08:02 crc kubenswrapper[4837]: W0313 12:08:02.605444 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3f87d89_35d5_4dc0_9c37_5297718a9351.slice/crio-5b07cd89ede2fe2c1fb21826f3293ae30dafc157e24ff9974fed53f7d9792a41 WatchSource:0}: Error finding container 5b07cd89ede2fe2c1fb21826f3293ae30dafc157e24ff9974fed53f7d9792a41: Status 404 returned error can't find the container with id 5b07cd89ede2fe2c1fb21826f3293ae30dafc157e24ff9974fed53f7d9792a41 Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.720835 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8886-account-create-update-ljcrw" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.740242 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4581-account-create-update-w6tc2"
Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.813573 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-bfbc874dc-vsh7q"
Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.815519 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-bfbc874dc-vsh7q"
Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.825989 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e397db42-b505-4447-87a2-4c12ed412f28-operator-scripts\") pod \"e397db42-b505-4447-87a2-4c12ed412f28\" (UID: \"e397db42-b505-4447-87a2-4c12ed412f28\") "
Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.826115 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec46ef58-a8e9-4354-b9a1-568535879964-operator-scripts\") pod \"ec46ef58-a8e9-4354-b9a1-568535879964\" (UID: \"ec46ef58-a8e9-4354-b9a1-568535879964\") "
Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.826172 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8wqn\" (UniqueName: \"kubernetes.io/projected/e397db42-b505-4447-87a2-4c12ed412f28-kube-api-access-q8wqn\") pod \"e397db42-b505-4447-87a2-4c12ed412f28\" (UID: \"e397db42-b505-4447-87a2-4c12ed412f28\") "
Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.826237 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcs7r\" (UniqueName: \"kubernetes.io/projected/ec46ef58-a8e9-4354-b9a1-568535879964-kube-api-access-bcs7r\") pod \"ec46ef58-a8e9-4354-b9a1-568535879964\" (UID: \"ec46ef58-a8e9-4354-b9a1-568535879964\") "
Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.828357 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec46ef58-a8e9-4354-b9a1-568535879964-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ec46ef58-a8e9-4354-b9a1-568535879964" (UID: "ec46ef58-a8e9-4354-b9a1-568535879964"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.828493 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e397db42-b505-4447-87a2-4c12ed412f28-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e397db42-b505-4447-87a2-4c12ed412f28" (UID: "e397db42-b505-4447-87a2-4c12ed412f28"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.840208 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec46ef58-a8e9-4354-b9a1-568535879964-kube-api-access-bcs7r" (OuterVolumeSpecName: "kube-api-access-bcs7r") pod "ec46ef58-a8e9-4354-b9a1-568535879964" (UID: "ec46ef58-a8e9-4354-b9a1-568535879964"). InnerVolumeSpecName "kube-api-access-bcs7r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.843535 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e397db42-b505-4447-87a2-4c12ed412f28-kube-api-access-q8wqn" (OuterVolumeSpecName: "kube-api-access-q8wqn") pod "e397db42-b505-4447-87a2-4c12ed412f28" (UID: "e397db42-b505-4447-87a2-4c12ed412f28"). InnerVolumeSpecName "kube-api-access-q8wqn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.860608 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.860901 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9fdb2289-943a-4078-ab5f-cab9a7b4faf1" containerName="glance-log" containerID="cri-o://4c6f80cedfefe6ca3ffa3fd1f8e5bca2af1a1e041ef15266c27ebfeb6b6939ec" gracePeriod=30
Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.861069 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9fdb2289-943a-4078-ab5f-cab9a7b4faf1" containerName="glance-httpd" containerID="cri-o://a7e9d992e509609ea914f80658069ef20b3e4ab7548f88fd1489567b1ca63a1f" gracePeriod=30
Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.928694 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec46ef58-a8e9-4354-b9a1-568535879964-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.928731 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8wqn\" (UniqueName: \"kubernetes.io/projected/e397db42-b505-4447-87a2-4c12ed412f28-kube-api-access-q8wqn\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.928745 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcs7r\" (UniqueName: \"kubernetes.io/projected/ec46ef58-a8e9-4354-b9a1-568535879964-kube-api-access-bcs7r\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.928758 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e397db42-b505-4447-87a2-4c12ed412f28-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.944382 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5596f9dfb8-m9bxb"
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.030567 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a28d7a5-22a2-460a-a08c-8eb484e6c382-scripts\") pod \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") "
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.030713 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-horizon-tls-certs\") pod \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") "
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.030736 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvsmz\" (UniqueName: \"kubernetes.io/projected/2a28d7a5-22a2-460a-a08c-8eb484e6c382-kube-api-access-wvsmz\") pod \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") "
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.030805 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-combined-ca-bundle\") pod \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") "
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.030838 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-horizon-secret-key\") pod \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") "
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.030877 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a28d7a5-22a2-460a-a08c-8eb484e6c382-config-data\") pod \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") "
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.030949 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a28d7a5-22a2-460a-a08c-8eb484e6c382-logs\") pod \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") "
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.031711 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a28d7a5-22a2-460a-a08c-8eb484e6c382-logs" (OuterVolumeSpecName: "logs") pod "2a28d7a5-22a2-460a-a08c-8eb484e6c382" (UID: "2a28d7a5-22a2-460a-a08c-8eb484e6c382"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.035287 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a28d7a5-22a2-460a-a08c-8eb484e6c382-kube-api-access-wvsmz" (OuterVolumeSpecName: "kube-api-access-wvsmz") pod "2a28d7a5-22a2-460a-a08c-8eb484e6c382" (UID: "2a28d7a5-22a2-460a-a08c-8eb484e6c382"). InnerVolumeSpecName "kube-api-access-wvsmz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.042613 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2a28d7a5-22a2-460a-a08c-8eb484e6c382" (UID: "2a28d7a5-22a2-460a-a08c-8eb484e6c382"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.065778 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="073acab9-3b9b-432a-aef7-b59bad9fa6ea" path="/var/lib/kubelet/pods/073acab9-3b9b-432a-aef7-b59bad9fa6ea/volumes"
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.066721 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0173ba9-535a-435d-bc51-75c069e69e46" path="/var/lib/kubelet/pods/f0173ba9-535a-435d-bc51-75c069e69e46/volumes"
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.074998 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a28d7a5-22a2-460a-a08c-8eb484e6c382-config-data" (OuterVolumeSpecName: "config-data") pod "2a28d7a5-22a2-460a-a08c-8eb484e6c382" (UID: "2a28d7a5-22a2-460a-a08c-8eb484e6c382"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.102606 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d3f87d89-35d5-4dc0-9c37-5297718a9351","Type":"ContainerStarted","Data":"5b07cd89ede2fe2c1fb21826f3293ae30dafc157e24ff9974fed53f7d9792a41"}
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.104421 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8886-account-create-update-ljcrw" event={"ID":"e397db42-b505-4447-87a2-4c12ed412f28","Type":"ContainerDied","Data":"e4a50d17ef0d5b10ca2c0d2aeafd143cd9c5e63e31ce86c11aca1ecba4422049"}
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.104451 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4a50d17ef0d5b10ca2c0d2aeafd143cd9c5e63e31ce86c11aca1ecba4422049"
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.104502 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8886-account-create-update-ljcrw"
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.118331 4837 generic.go:334] "Generic (PLEG): container finished" podID="9fdb2289-943a-4078-ab5f-cab9a7b4faf1" containerID="4c6f80cedfefe6ca3ffa3fd1f8e5bca2af1a1e041ef15266c27ebfeb6b6939ec" exitCode=143
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.118470 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9fdb2289-943a-4078-ab5f-cab9a7b4faf1","Type":"ContainerDied","Data":"4c6f80cedfefe6ca3ffa3fd1f8e5bca2af1a1e041ef15266c27ebfeb6b6939ec"}
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.125284 4837 generic.go:334] "Generic (PLEG): container finished" podID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerID="92b3db8efc4bd781409e05974c86a887259d700facd2c2ab05a9fcc6613ce654" exitCode=137
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.125357 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5596f9dfb8-m9bxb" event={"ID":"2a28d7a5-22a2-460a-a08c-8eb484e6c382","Type":"ContainerDied","Data":"92b3db8efc4bd781409e05974c86a887259d700facd2c2ab05a9fcc6613ce654"}
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.125388 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5596f9dfb8-m9bxb" event={"ID":"2a28d7a5-22a2-460a-a08c-8eb484e6c382","Type":"ContainerDied","Data":"60985fc2aa747df3481773b902df6591e8f7e0a9aaa937b1d8ccf7c3a2e33f6e"}
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.125408 4837 scope.go:117] "RemoveContainer" containerID="7e464f7436823332f050e26237bc563d04c928c21ee9b8d3087ae1cc9a85aacb"
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.125528 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5596f9dfb8-m9bxb"
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.129462 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a28d7a5-22a2-460a-a08c-8eb484e6c382" (UID: "2a28d7a5-22a2-460a-a08c-8eb484e6c382"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.133084 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a28d7a5-22a2-460a-a08c-8eb484e6c382-logs\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.133139 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvsmz\" (UniqueName: \"kubernetes.io/projected/2a28d7a5-22a2-460a-a08c-8eb484e6c382-kube-api-access-wvsmz\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.133158 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.133170 4837 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.133181 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a28d7a5-22a2-460a-a08c-8eb484e6c382-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.144775 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4581-account-create-update-w6tc2"
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.144773 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4581-account-create-update-w6tc2" event={"ID":"ec46ef58-a8e9-4354-b9a1-568535879964","Type":"ContainerDied","Data":"c19e4f72f1c5f35690c3fb2bd2be44a6fa31ffdb302bbef941997a819e69b808"}
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.144910 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c19e4f72f1c5f35690c3fb2bd2be44a6fa31ffdb302bbef941997a819e69b808"
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.147553 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556728-7n29h" event={"ID":"47ae408b-faad-4a52-ad09-428242645381","Type":"ContainerStarted","Data":"d2184d47fa1ce72a82da97184468ccee1cece609eb9ab8fb1194680ef9c8ea21"}
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.158612 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec252a2a-f9a4-4894-991d-1a70f596519d","Type":"ContainerStarted","Data":"4ac16d5369630fdba4cee8a516cca217bdb0ae52551269ab68960255bd7bcb07"}
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.158761 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="ceilometer-central-agent" containerID="cri-o://f13323c6b5c7472c3b9f76328f6f7d80a5868b615ab9c24cb1496e6b292c2e9a" gracePeriod=30
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.158880 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="proxy-httpd" containerID="cri-o://4ac16d5369630fdba4cee8a516cca217bdb0ae52551269ab68960255bd7bcb07" gracePeriod=30
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.158932 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="sg-core" containerID="cri-o://2fa9cafe6c8b9f2bb8d388b54db3598398838d667dccefef13c7e8655cabb201" gracePeriod=30
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.158976 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="ceilometer-notification-agent" containerID="cri-o://759b2c4c55f496a02021bffec50cb8a1d6cfb6037ffefb8f05fc410f86a3f8d4" gracePeriod=30
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.181290 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "2a28d7a5-22a2-460a-a08c-8eb484e6c382" (UID: "2a28d7a5-22a2-460a-a08c-8eb484e6c382"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.198202 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556728-7n29h" podStartSLOduration=2.255125133 podStartE2EDuration="3.198178236s" podCreationTimestamp="2026-03-13 12:08:00 +0000 UTC" firstStartedPulling="2026-03-13 12:08:01.132725043 +0000 UTC m=+1196.770991806" lastFinishedPulling="2026-03-13 12:08:02.075778146 +0000 UTC m=+1197.714044909" observedRunningTime="2026-03-13 12:08:03.16427505 +0000 UTC m=+1198.802541823" watchObservedRunningTime="2026-03-13 12:08:03.198178236 +0000 UTC m=+1198.836445009"
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.207307 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a28d7a5-22a2-460a-a08c-8eb484e6c382-scripts" (OuterVolumeSpecName: "scripts") pod "2a28d7a5-22a2-460a-a08c-8eb484e6c382" (UID: "2a28d7a5-22a2-460a-a08c-8eb484e6c382"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.210153 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.553587896 podStartE2EDuration="8.210135992s" podCreationTimestamp="2026-03-13 12:07:55 +0000 UTC" firstStartedPulling="2026-03-13 12:07:56.214871619 +0000 UTC m=+1191.853138382" lastFinishedPulling="2026-03-13 12:08:01.871419715 +0000 UTC m=+1197.509686478" observedRunningTime="2026-03-13 12:08:03.187379787 +0000 UTC m=+1198.825646550" watchObservedRunningTime="2026-03-13 12:08:03.210135992 +0000 UTC m=+1198.848402755"
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.235343 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a28d7a5-22a2-460a-a08c-8eb484e6c382-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.235370 4837 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.367982 4837 scope.go:117] "RemoveContainer" containerID="92b3db8efc4bd781409e05974c86a887259d700facd2c2ab05a9fcc6613ce654"
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.482201 4837 scope.go:117] "RemoveContainer" containerID="7e464f7436823332f050e26237bc563d04c928c21ee9b8d3087ae1cc9a85aacb"
Mar 13 12:08:03 crc kubenswrapper[4837]: E0313 12:08:03.482690 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e464f7436823332f050e26237bc563d04c928c21ee9b8d3087ae1cc9a85aacb\": container with ID starting with 7e464f7436823332f050e26237bc563d04c928c21ee9b8d3087ae1cc9a85aacb not found: ID does not exist" containerID="7e464f7436823332f050e26237bc563d04c928c21ee9b8d3087ae1cc9a85aacb"
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.482728 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e464f7436823332f050e26237bc563d04c928c21ee9b8d3087ae1cc9a85aacb"} err="failed to get container status \"7e464f7436823332f050e26237bc563d04c928c21ee9b8d3087ae1cc9a85aacb\": rpc error: code = NotFound desc = could not find container \"7e464f7436823332f050e26237bc563d04c928c21ee9b8d3087ae1cc9a85aacb\": container with ID starting with 7e464f7436823332f050e26237bc563d04c928c21ee9b8d3087ae1cc9a85aacb not found: ID does not exist"
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.482755 4837 scope.go:117] "RemoveContainer" containerID="92b3db8efc4bd781409e05974c86a887259d700facd2c2ab05a9fcc6613ce654"
Mar 13 12:08:03 crc kubenswrapper[4837]: E0313 12:08:03.483301 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92b3db8efc4bd781409e05974c86a887259d700facd2c2ab05a9fcc6613ce654\": container with ID starting with 92b3db8efc4bd781409e05974c86a887259d700facd2c2ab05a9fcc6613ce654 not found: ID does not exist" containerID="92b3db8efc4bd781409e05974c86a887259d700facd2c2ab05a9fcc6613ce654"
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.483349 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92b3db8efc4bd781409e05974c86a887259d700facd2c2ab05a9fcc6613ce654"} err="failed to get container status \"92b3db8efc4bd781409e05974c86a887259d700facd2c2ab05a9fcc6613ce654\": rpc error: code = NotFound desc = could not find container \"92b3db8efc4bd781409e05974c86a887259d700facd2c2ab05a9fcc6613ce654\": container with ID starting with 92b3db8efc4bd781409e05974c86a887259d700facd2c2ab05a9fcc6613ce654 not found: ID does not exist"
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.512069 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5596f9dfb8-m9bxb"]
Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.525709 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5596f9dfb8-m9bxb"]
Mar 13 12:08:04 crc kubenswrapper[4837]: I0313 12:08:04.183828 4837 generic.go:334] "Generic (PLEG): container finished" podID="47ae408b-faad-4a52-ad09-428242645381" containerID="d2184d47fa1ce72a82da97184468ccee1cece609eb9ab8fb1194680ef9c8ea21" exitCode=0
Mar 13 12:08:04 crc kubenswrapper[4837]: I0313 12:08:04.184181 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556728-7n29h" event={"ID":"47ae408b-faad-4a52-ad09-428242645381","Type":"ContainerDied","Data":"d2184d47fa1ce72a82da97184468ccee1cece609eb9ab8fb1194680ef9c8ea21"}
Mar 13 12:08:04 crc kubenswrapper[4837]: I0313 12:08:04.187720 4837 generic.go:334] "Generic (PLEG): container finished" podID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerID="4ac16d5369630fdba4cee8a516cca217bdb0ae52551269ab68960255bd7bcb07" exitCode=0
Mar 13 12:08:04 crc kubenswrapper[4837]: I0313 12:08:04.187750 4837 generic.go:334] "Generic (PLEG): container finished" podID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerID="2fa9cafe6c8b9f2bb8d388b54db3598398838d667dccefef13c7e8655cabb201" exitCode=2
Mar 13 12:08:04 crc kubenswrapper[4837]: I0313 12:08:04.187759 4837 generic.go:334] "Generic (PLEG): container finished" podID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerID="759b2c4c55f496a02021bffec50cb8a1d6cfb6037ffefb8f05fc410f86a3f8d4" exitCode=0
Mar 13 12:08:04 crc kubenswrapper[4837]: I0313 12:08:04.187794 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec252a2a-f9a4-4894-991d-1a70f596519d","Type":"ContainerDied","Data":"4ac16d5369630fdba4cee8a516cca217bdb0ae52551269ab68960255bd7bcb07"}
Mar 13 12:08:04 crc kubenswrapper[4837]: I0313 12:08:04.187817 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec252a2a-f9a4-4894-991d-1a70f596519d","Type":"ContainerDied","Data":"2fa9cafe6c8b9f2bb8d388b54db3598398838d667dccefef13c7e8655cabb201"}
Mar 13 12:08:04 crc kubenswrapper[4837]: I0313 12:08:04.187829 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec252a2a-f9a4-4894-991d-1a70f596519d","Type":"ContainerDied","Data":"759b2c4c55f496a02021bffec50cb8a1d6cfb6037ffefb8f05fc410f86a3f8d4"}
Mar 13 12:08:04 crc kubenswrapper[4837]: I0313 12:08:04.191199 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d3f87d89-35d5-4dc0-9c37-5297718a9351","Type":"ContainerStarted","Data":"db9e0234f3793624350d9bf2860efc958e3f44554f4b5ac4ae84cf488c1ce7e4"}
Mar 13 12:08:04 crc kubenswrapper[4837]: I0313 12:08:04.191365 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d3f87d89-35d5-4dc0-9c37-5297718a9351","Type":"ContainerStarted","Data":"fdfc517ecfc54d1ab6ae838e5f84fb794c7ede4598c042e712fca287be710aad"}
Mar 13 12:08:04 crc kubenswrapper[4837]: I0313 12:08:04.228433 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.228410309 podStartE2EDuration="3.228410309s" podCreationTimestamp="2026-03-13 12:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:04.221471811 +0000 UTC m=+1199.859738584" watchObservedRunningTime="2026-03-13 12:08:04.228410309 +0000 UTC m=+1199.866677072"
Mar 13 12:08:05 crc kubenswrapper[4837]: I0313 12:08:05.063947 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" path="/var/lib/kubelet/pods/2a28d7a5-22a2-460a-a08c-8eb484e6c382/volumes"
Mar 13 12:08:05 crc kubenswrapper[4837]: I0313 12:08:05.484258 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 12:08:05 crc kubenswrapper[4837]: I0313 12:08:05.484631 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 12:08:05 crc kubenswrapper[4837]: I0313 12:08:05.545574 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556728-7n29h"
Mar 13 12:08:05 crc kubenswrapper[4837]: I0313 12:08:05.576861 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2d5k\" (UniqueName: \"kubernetes.io/projected/47ae408b-faad-4a52-ad09-428242645381-kube-api-access-l2d5k\") pod \"47ae408b-faad-4a52-ad09-428242645381\" (UID: \"47ae408b-faad-4a52-ad09-428242645381\") "
Mar 13 12:08:05 crc kubenswrapper[4837]: I0313 12:08:05.582838 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47ae408b-faad-4a52-ad09-428242645381-kube-api-access-l2d5k" (OuterVolumeSpecName: "kube-api-access-l2d5k") pod "47ae408b-faad-4a52-ad09-428242645381" (UID: "47ae408b-faad-4a52-ad09-428242645381"). InnerVolumeSpecName "kube-api-access-l2d5k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:08:05 crc kubenswrapper[4837]: I0313 12:08:05.679369 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2d5k\" (UniqueName: \"kubernetes.io/projected/47ae408b-faad-4a52-ad09-428242645381-kube-api-access-l2d5k\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.221627 4837 generic.go:334] "Generic (PLEG): container finished" podID="9fdb2289-943a-4078-ab5f-cab9a7b4faf1" containerID="a7e9d992e509609ea914f80658069ef20b3e4ab7548f88fd1489567b1ca63a1f" exitCode=0
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.221669 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9fdb2289-943a-4078-ab5f-cab9a7b4faf1","Type":"ContainerDied","Data":"a7e9d992e509609ea914f80658069ef20b3e4ab7548f88fd1489567b1ca63a1f"}
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.224010 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556728-7n29h" event={"ID":"47ae408b-faad-4a52-ad09-428242645381","Type":"ContainerDied","Data":"f3da93ab4a472ba7116a0beb08f63b2c302111f8cbb9bf5768b3b8124101f12f"}
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.224048 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3da93ab4a472ba7116a0beb08f63b2c302111f8cbb9bf5768b3b8124101f12f"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.224055 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556728-7n29h"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.242178 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556722-h599x"]
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.251091 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556722-h599x"]
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.615342 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.804366 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-f6gwd"]
Mar 13 12:08:06 crc kubenswrapper[4837]: E0313 12:08:06.804745 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec46ef58-a8e9-4354-b9a1-568535879964" containerName="mariadb-account-create-update"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.804759 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec46ef58-a8e9-4354-b9a1-568535879964" containerName="mariadb-account-create-update"
Mar 13 12:08:06 crc kubenswrapper[4837]: E0313 12:08:06.804778 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fdb2289-943a-4078-ab5f-cab9a7b4faf1" containerName="glance-log"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.804785 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fdb2289-943a-4078-ab5f-cab9a7b4faf1" containerName="glance-log"
Mar 13 12:08:06 crc kubenswrapper[4837]: E0313 12:08:06.804797 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac843c1-9934-4711-aae6-7f6920596cb3" containerName="mariadb-account-create-update"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.804803 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac843c1-9934-4711-aae6-7f6920596cb3" containerName="mariadb-account-create-update"
Mar 13 12:08:06 crc kubenswrapper[4837]: E0313 12:08:06.804814 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb" containerName="mariadb-database-create"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.804820 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb" containerName="mariadb-database-create"
Mar 13 12:08:06 crc kubenswrapper[4837]: E0313 12:08:06.804833 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerName="horizon"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.804839 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerName="horizon"
Mar 13 12:08:06 crc kubenswrapper[4837]: E0313 12:08:06.804851 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerName="horizon-log"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.804857 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerName="horizon-log"
Mar 13 12:08:06 crc kubenswrapper[4837]: E0313 12:08:06.804869 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="073acab9-3b9b-432a-aef7-b59bad9fa6ea" containerName="neutron-httpd"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.804875 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="073acab9-3b9b-432a-aef7-b59bad9fa6ea" containerName="neutron-httpd"
Mar 13 12:08:06 crc kubenswrapper[4837]: E0313 12:08:06.804884 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47ae408b-faad-4a52-ad09-428242645381" containerName="oc"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.804890 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="47ae408b-faad-4a52-ad09-428242645381" containerName="oc"
Mar 13 12:08:06 crc kubenswrapper[4837]: E0313 12:08:06.804896 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e397db42-b505-4447-87a2-4c12ed412f28" containerName="mariadb-account-create-update"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.804902 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e397db42-b505-4447-87a2-4c12ed412f28" containerName="mariadb-account-create-update"
Mar 13 12:08:06 crc kubenswrapper[4837]: E0313 12:08:06.804914 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="073acab9-3b9b-432a-aef7-b59bad9fa6ea" containerName="neutron-api"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.804920 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="073acab9-3b9b-432a-aef7-b59bad9fa6ea" containerName="neutron-api"
Mar 13 12:08:06 crc kubenswrapper[4837]: E0313 12:08:06.804933 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fdb2289-943a-4078-ab5f-cab9a7b4faf1" containerName="glance-httpd"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.804938 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fdb2289-943a-4078-ab5f-cab9a7b4faf1" containerName="glance-httpd"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.805090 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerName="horizon-log"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.805107 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fdb2289-943a-4078-ab5f-cab9a7b4faf1" containerName="glance-log"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.805116 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec46ef58-a8e9-4354-b9a1-568535879964" containerName="mariadb-account-create-update"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.805128 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="47ae408b-faad-4a52-ad09-428242645381" containerName="oc"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.805135 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerName="horizon"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.805144 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac843c1-9934-4711-aae6-7f6920596cb3" containerName="mariadb-account-create-update"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.805152 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb" containerName="mariadb-database-create"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.805160 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="073acab9-3b9b-432a-aef7-b59bad9fa6ea" containerName="neutron-httpd"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.805170 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="073acab9-3b9b-432a-aef7-b59bad9fa6ea" containerName="neutron-api"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.805180 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e397db42-b505-4447-87a2-4c12ed412f28" containerName="mariadb-account-create-update"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.805192 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fdb2289-943a-4078-ab5f-cab9a7b4faf1" containerName="glance-httpd"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.805749 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-f6gwd"
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.807164 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-combined-ca-bundle\") pod \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") "
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.807221 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cp4n\" (UniqueName: \"kubernetes.io/projected/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-kube-api-access-4cp4n\") pod \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") "
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.807244 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-config-data\") pod \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") "
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.807323 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-logs\") pod \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") "
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.807377 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-scripts\") pod \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") "
Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.807399 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName:
\"kubernetes.io/empty-dir/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-httpd-run\") pod \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.807502 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.807540 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-internal-tls-certs\") pod \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.808136 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-logs" (OuterVolumeSpecName: "logs") pod "9fdb2289-943a-4078-ab5f-cab9a7b4faf1" (UID: "9fdb2289-943a-4078-ab5f-cab9a7b4faf1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.809952 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.810104 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.810471 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9fdb2289-943a-4078-ab5f-cab9a7b4faf1" (UID: "9fdb2289-943a-4078-ab5f-cab9a7b4faf1"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.812495 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qctwr" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.830056 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-f6gwd"] Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.842537 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-kube-api-access-4cp4n" (OuterVolumeSpecName: "kube-api-access-4cp4n") pod "9fdb2289-943a-4078-ab5f-cab9a7b4faf1" (UID: "9fdb2289-943a-4078-ab5f-cab9a7b4faf1"). InnerVolumeSpecName "kube-api-access-4cp4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.846976 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-scripts" (OuterVolumeSpecName: "scripts") pod "9fdb2289-943a-4078-ab5f-cab9a7b4faf1" (UID: "9fdb2289-943a-4078-ab5f-cab9a7b4faf1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.851852 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "9fdb2289-943a-4078-ab5f-cab9a7b4faf1" (UID: "9fdb2289-943a-4078-ab5f-cab9a7b4faf1"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.908771 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9fdb2289-943a-4078-ab5f-cab9a7b4faf1" (UID: "9fdb2289-943a-4078-ab5f-cab9a7b4faf1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.908821 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-f6gwd\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.908878 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-config-data\") pod \"nova-cell0-conductor-db-sync-f6gwd\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.908957 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqhf2\" (UniqueName: \"kubernetes.io/projected/5d6d5bbe-7e5b-4645-95c4-af868cba3244-kube-api-access-gqhf2\") pod \"nova-cell0-conductor-db-sync-f6gwd\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.909007 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-scripts\") pod \"nova-cell0-conductor-db-sync-f6gwd\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.910208 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.910239 4837 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.910261 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.910270 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.910279 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cp4n\" (UniqueName: \"kubernetes.io/projected/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-kube-api-access-4cp4n\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.910288 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.956550 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on 
node "crc" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.973851 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9fdb2289-943a-4078-ab5f-cab9a7b4faf1" (UID: "9fdb2289-943a-4078-ab5f-cab9a7b4faf1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.977169 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-config-data" (OuterVolumeSpecName: "config-data") pod "9fdb2289-943a-4078-ab5f-cab9a7b4faf1" (UID: "9fdb2289-943a-4078-ab5f-cab9a7b4faf1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.011411 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-scripts\") pod \"nova-cell0-conductor-db-sync-f6gwd\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.011466 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-f6gwd\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.011510 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-config-data\") pod \"nova-cell0-conductor-db-sync-f6gwd\" (UID: 
\"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.011585 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqhf2\" (UniqueName: \"kubernetes.io/projected/5d6d5bbe-7e5b-4645-95c4-af868cba3244-kube-api-access-gqhf2\") pod \"nova-cell0-conductor-db-sync-f6gwd\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.011629 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.011656 4837 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.011666 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.026374 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-scripts\") pod \"nova-cell0-conductor-db-sync-f6gwd\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.027459 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-config-data\") pod \"nova-cell0-conductor-db-sync-f6gwd\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " 
pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.028480 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-f6gwd\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.034165 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqhf2\" (UniqueName: \"kubernetes.io/projected/5d6d5bbe-7e5b-4645-95c4-af868cba3244-kube-api-access-gqhf2\") pod \"nova-cell0-conductor-db-sync-f6gwd\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.040306 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.075265 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be033789-27be-444d-b72e-7abbbb34b285" path="/var/lib/kubelet/pods/be033789-27be-444d-b72e-7abbbb34b285/volumes" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.239436 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9fdb2289-943a-4078-ab5f-cab9a7b4faf1","Type":"ContainerDied","Data":"9af1f1b6bae1b057a7c5b2be284aed718dd1bd53fd4267a097ec24a461a2d852"} Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.239784 4837 scope.go:117] "RemoveContainer" containerID="a7e9d992e509609ea914f80658069ef20b3e4ab7548f88fd1489567b1ca63a1f" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.239546 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.313584 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.334710 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.354705 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.356587 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.359218 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.360100 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.370557 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.395507 4837 scope.go:117] "RemoveContainer" containerID="4c6f80cedfefe6ca3ffa3fd1f8e5bca2af1a1e041ef15266c27ebfeb6b6939ec" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.527131 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f3b003-127f-414f-877a-8f7df2872049-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.527400 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.527432 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48pn7\" (UniqueName: \"kubernetes.io/projected/d0f3b003-127f-414f-877a-8f7df2872049-kube-api-access-48pn7\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.527465 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0f3b003-127f-414f-877a-8f7df2872049-logs\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.527580 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f3b003-127f-414f-877a-8f7df2872049-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.527616 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f3b003-127f-414f-877a-8f7df2872049-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.527663 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d0f3b003-127f-414f-877a-8f7df2872049-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.527701 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0f3b003-127f-414f-877a-8f7df2872049-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.587161 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-f6gwd"] Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.594785 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.629547 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f3b003-127f-414f-877a-8f7df2872049-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.629604 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f3b003-127f-414f-877a-8f7df2872049-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.629670 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/d0f3b003-127f-414f-877a-8f7df2872049-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.629721 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0f3b003-127f-414f-877a-8f7df2872049-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.629766 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f3b003-127f-414f-877a-8f7df2872049-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.629816 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.629854 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48pn7\" (UniqueName: \"kubernetes.io/projected/d0f3b003-127f-414f-877a-8f7df2872049-kube-api-access-48pn7\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.629894 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0f3b003-127f-414f-877a-8f7df2872049-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.630115 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d0f3b003-127f-414f-877a-8f7df2872049-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.630255 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0f3b003-127f-414f-877a-8f7df2872049-logs\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.630698 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.638770 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f3b003-127f-414f-877a-8f7df2872049-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.639514 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f3b003-127f-414f-877a-8f7df2872049-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") 
" pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.640548 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f3b003-127f-414f-877a-8f7df2872049-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.643506 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0f3b003-127f-414f-877a-8f7df2872049-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.652535 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48pn7\" (UniqueName: \"kubernetes.io/projected/d0f3b003-127f-414f-877a-8f7df2872049-kube-api-access-48pn7\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.683031 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.739397 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.934951 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-sg-core-conf-yaml\") pod \"ec252a2a-f9a4-4894-991d-1a70f596519d\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.935244 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-combined-ca-bundle\") pod \"ec252a2a-f9a4-4894-991d-1a70f596519d\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.935420 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-scripts\") pod \"ec252a2a-f9a4-4894-991d-1a70f596519d\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.935528 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-config-data\") pod \"ec252a2a-f9a4-4894-991d-1a70f596519d\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.935669 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jmwm\" (UniqueName: \"kubernetes.io/projected/ec252a2a-f9a4-4894-991d-1a70f596519d-kube-api-access-9jmwm\") pod \"ec252a2a-f9a4-4894-991d-1a70f596519d\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.935819 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ec252a2a-f9a4-4894-991d-1a70f596519d-log-httpd\") pod \"ec252a2a-f9a4-4894-991d-1a70f596519d\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.935895 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec252a2a-f9a4-4894-991d-1a70f596519d-run-httpd\") pod \"ec252a2a-f9a4-4894-991d-1a70f596519d\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.936123 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec252a2a-f9a4-4894-991d-1a70f596519d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ec252a2a-f9a4-4894-991d-1a70f596519d" (UID: "ec252a2a-f9a4-4894-991d-1a70f596519d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.936245 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec252a2a-f9a4-4894-991d-1a70f596519d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ec252a2a-f9a4-4894-991d-1a70f596519d" (UID: "ec252a2a-f9a4-4894-991d-1a70f596519d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.936616 4837 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec252a2a-f9a4-4894-991d-1a70f596519d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.936722 4837 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec252a2a-f9a4-4894-991d-1a70f596519d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.943782 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-scripts" (OuterVolumeSpecName: "scripts") pod "ec252a2a-f9a4-4894-991d-1a70f596519d" (UID: "ec252a2a-f9a4-4894-991d-1a70f596519d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.946583 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec252a2a-f9a4-4894-991d-1a70f596519d-kube-api-access-9jmwm" (OuterVolumeSpecName: "kube-api-access-9jmwm") pod "ec252a2a-f9a4-4894-991d-1a70f596519d" (UID: "ec252a2a-f9a4-4894-991d-1a70f596519d"). InnerVolumeSpecName "kube-api-access-9jmwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.974255 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.006810 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ec252a2a-f9a4-4894-991d-1a70f596519d" (UID: "ec252a2a-f9a4-4894-991d-1a70f596519d"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.038319 4837 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.038353 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.038363 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jmwm\" (UniqueName: \"kubernetes.io/projected/ec252a2a-f9a4-4894-991d-1a70f596519d-kube-api-access-9jmwm\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.049761 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec252a2a-f9a4-4894-991d-1a70f596519d" (UID: "ec252a2a-f9a4-4894-991d-1a70f596519d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.063719 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-config-data" (OuterVolumeSpecName: "config-data") pod "ec252a2a-f9a4-4894-991d-1a70f596519d" (UID: "ec252a2a-f9a4-4894-991d-1a70f596519d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.139737 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.139778 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.221004 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.265351 4837 generic.go:334] "Generic (PLEG): container finished" podID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerID="f13323c6b5c7472c3b9f76328f6f7d80a5868b615ab9c24cb1496e6b292c2e9a" exitCode=0 Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.265426 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec252a2a-f9a4-4894-991d-1a70f596519d","Type":"ContainerDied","Data":"f13323c6b5c7472c3b9f76328f6f7d80a5868b615ab9c24cb1496e6b292c2e9a"} Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.265454 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec252a2a-f9a4-4894-991d-1a70f596519d","Type":"ContainerDied","Data":"8963d958bbfe2f25190f6d4efa0bcd7a6fe7c107dfdb4e163c3ec794ab189d07"} Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.265475 4837 scope.go:117] "RemoveContainer" containerID="4ac16d5369630fdba4cee8a516cca217bdb0ae52551269ab68960255bd7bcb07" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.265587 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.300076 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-f6gwd" event={"ID":"5d6d5bbe-7e5b-4645-95c4-af868cba3244","Type":"ContainerStarted","Data":"6144ca86ef9d0d9f5e120027d710fea9eb400bcc8f2a208f56ef661ebbec1f34"} Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.310794 4837 scope.go:117] "RemoveContainer" containerID="2fa9cafe6c8b9f2bb8d388b54db3598398838d667dccefef13c7e8655cabb201" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.321467 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.339496 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.363260 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:08:08 crc kubenswrapper[4837]: E0313 12:08:08.366671 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="proxy-httpd" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.366711 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="proxy-httpd" Mar 13 12:08:08 crc kubenswrapper[4837]: E0313 12:08:08.366773 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="ceilometer-central-agent" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.366783 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="ceilometer-central-agent" Mar 13 12:08:08 crc kubenswrapper[4837]: E0313 12:08:08.366811 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" 
containerName="ceilometer-notification-agent" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.366818 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="ceilometer-notification-agent" Mar 13 12:08:08 crc kubenswrapper[4837]: E0313 12:08:08.366834 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="sg-core" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.366842 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="sg-core" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.367713 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="ceilometer-central-agent" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.367774 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="ceilometer-notification-agent" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.367793 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="sg-core" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.367821 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="proxy-httpd" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.377768 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.384213 4837 scope.go:117] "RemoveContainer" containerID="759b2c4c55f496a02021bffec50cb8a1d6cfb6037ffefb8f05fc410f86a3f8d4" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.389098 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.389471 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.411450 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.458489 4837 scope.go:117] "RemoveContainer" containerID="f13323c6b5c7472c3b9f76328f6f7d80a5868b615ab9c24cb1496e6b292c2e9a" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.497692 4837 scope.go:117] "RemoveContainer" containerID="4ac16d5369630fdba4cee8a516cca217bdb0ae52551269ab68960255bd7bcb07" Mar 13 12:08:08 crc kubenswrapper[4837]: E0313 12:08:08.498052 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ac16d5369630fdba4cee8a516cca217bdb0ae52551269ab68960255bd7bcb07\": container with ID starting with 4ac16d5369630fdba4cee8a516cca217bdb0ae52551269ab68960255bd7bcb07 not found: ID does not exist" containerID="4ac16d5369630fdba4cee8a516cca217bdb0ae52551269ab68960255bd7bcb07" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.498083 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ac16d5369630fdba4cee8a516cca217bdb0ae52551269ab68960255bd7bcb07"} err="failed to get container status \"4ac16d5369630fdba4cee8a516cca217bdb0ae52551269ab68960255bd7bcb07\": rpc error: code = NotFound desc = could not find container \"4ac16d5369630fdba4cee8a516cca217bdb0ae52551269ab68960255bd7bcb07\": 
container with ID starting with 4ac16d5369630fdba4cee8a516cca217bdb0ae52551269ab68960255bd7bcb07 not found: ID does not exist" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.498107 4837 scope.go:117] "RemoveContainer" containerID="2fa9cafe6c8b9f2bb8d388b54db3598398838d667dccefef13c7e8655cabb201" Mar 13 12:08:08 crc kubenswrapper[4837]: E0313 12:08:08.498613 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fa9cafe6c8b9f2bb8d388b54db3598398838d667dccefef13c7e8655cabb201\": container with ID starting with 2fa9cafe6c8b9f2bb8d388b54db3598398838d667dccefef13c7e8655cabb201 not found: ID does not exist" containerID="2fa9cafe6c8b9f2bb8d388b54db3598398838d667dccefef13c7e8655cabb201" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.498682 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fa9cafe6c8b9f2bb8d388b54db3598398838d667dccefef13c7e8655cabb201"} err="failed to get container status \"2fa9cafe6c8b9f2bb8d388b54db3598398838d667dccefef13c7e8655cabb201\": rpc error: code = NotFound desc = could not find container \"2fa9cafe6c8b9f2bb8d388b54db3598398838d667dccefef13c7e8655cabb201\": container with ID starting with 2fa9cafe6c8b9f2bb8d388b54db3598398838d667dccefef13c7e8655cabb201 not found: ID does not exist" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.498718 4837 scope.go:117] "RemoveContainer" containerID="759b2c4c55f496a02021bffec50cb8a1d6cfb6037ffefb8f05fc410f86a3f8d4" Mar 13 12:08:08 crc kubenswrapper[4837]: E0313 12:08:08.499270 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"759b2c4c55f496a02021bffec50cb8a1d6cfb6037ffefb8f05fc410f86a3f8d4\": container with ID starting with 759b2c4c55f496a02021bffec50cb8a1d6cfb6037ffefb8f05fc410f86a3f8d4 not found: ID does not exist" 
containerID="759b2c4c55f496a02021bffec50cb8a1d6cfb6037ffefb8f05fc410f86a3f8d4" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.499325 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"759b2c4c55f496a02021bffec50cb8a1d6cfb6037ffefb8f05fc410f86a3f8d4"} err="failed to get container status \"759b2c4c55f496a02021bffec50cb8a1d6cfb6037ffefb8f05fc410f86a3f8d4\": rpc error: code = NotFound desc = could not find container \"759b2c4c55f496a02021bffec50cb8a1d6cfb6037ffefb8f05fc410f86a3f8d4\": container with ID starting with 759b2c4c55f496a02021bffec50cb8a1d6cfb6037ffefb8f05fc410f86a3f8d4 not found: ID does not exist" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.499363 4837 scope.go:117] "RemoveContainer" containerID="f13323c6b5c7472c3b9f76328f6f7d80a5868b615ab9c24cb1496e6b292c2e9a" Mar 13 12:08:08 crc kubenswrapper[4837]: E0313 12:08:08.499836 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f13323c6b5c7472c3b9f76328f6f7d80a5868b615ab9c24cb1496e6b292c2e9a\": container with ID starting with f13323c6b5c7472c3b9f76328f6f7d80a5868b615ab9c24cb1496e6b292c2e9a not found: ID does not exist" containerID="f13323c6b5c7472c3b9f76328f6f7d80a5868b615ab9c24cb1496e6b292c2e9a" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.499890 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f13323c6b5c7472c3b9f76328f6f7d80a5868b615ab9c24cb1496e6b292c2e9a"} err="failed to get container status \"f13323c6b5c7472c3b9f76328f6f7d80a5868b615ab9c24cb1496e6b292c2e9a\": rpc error: code = NotFound desc = could not find container \"f13323c6b5c7472c3b9f76328f6f7d80a5868b615ab9c24cb1496e6b292c2e9a\": container with ID starting with f13323c6b5c7472c3b9f76328f6f7d80a5868b615ab9c24cb1496e6b292c2e9a not found: ID does not exist" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.567777 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-config-data\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.567868 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-run-httpd\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.567905 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-scripts\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.567927 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.567983 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.568027 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6jrg\" (UniqueName: 
\"kubernetes.io/projected/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-kube-api-access-p6jrg\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.568066 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-log-httpd\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.572477 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:08:08 crc kubenswrapper[4837]: W0313 12:08:08.599543 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0f3b003_127f_414f_877a_8f7df2872049.slice/crio-eeae7f380050bdcbd8635a5445d8a2141e4fd6a8936c78d957ddea8b41ad3793 WatchSource:0}: Error finding container eeae7f380050bdcbd8635a5445d8a2141e4fd6a8936c78d957ddea8b41ad3793: Status 404 returned error can't find the container with id eeae7f380050bdcbd8635a5445d8a2141e4fd6a8936c78d957ddea8b41ad3793 Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.669452 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-config-data\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.669538 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-run-httpd\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: 
I0313 12:08:08.669584 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-scripts\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.669646 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.669709 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.669752 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6jrg\" (UniqueName: \"kubernetes.io/projected/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-kube-api-access-p6jrg\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.669881 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-log-httpd\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.670121 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-run-httpd\") pod \"ceilometer-0\" 
(UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.670241 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-log-httpd\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.681686 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.684169 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.685921 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-scripts\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.690627 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-config-data\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.696401 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6jrg\" (UniqueName: 
\"kubernetes.io/projected/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-kube-api-access-p6jrg\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.737473 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:08:09 crc kubenswrapper[4837]: I0313 12:08:09.059000 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fdb2289-943a-4078-ab5f-cab9a7b4faf1" path="/var/lib/kubelet/pods/9fdb2289-943a-4078-ab5f-cab9a7b4faf1/volumes" Mar 13 12:08:09 crc kubenswrapper[4837]: I0313 12:08:09.060119 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" path="/var/lib/kubelet/pods/ec252a2a-f9a4-4894-991d-1a70f596519d/volumes" Mar 13 12:08:09 crc kubenswrapper[4837]: I0313 12:08:09.249603 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:08:09 crc kubenswrapper[4837]: I0313 12:08:09.324853 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4fdb7-b384-4d2d-9bd4-4d33884e828c","Type":"ContainerStarted","Data":"a06c9f38e731b74f1c733f23ddeb517873bc8240455c11686148ecef7617ff18"} Mar 13 12:08:09 crc kubenswrapper[4837]: I0313 12:08:09.326955 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d0f3b003-127f-414f-877a-8f7df2872049","Type":"ContainerStarted","Data":"44826e4def7b4e6d925e64be0fe446443f064dc0048fccb1c82bbe3a889f12c6"} Mar 13 12:08:09 crc kubenswrapper[4837]: I0313 12:08:09.326989 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d0f3b003-127f-414f-877a-8f7df2872049","Type":"ContainerStarted","Data":"eeae7f380050bdcbd8635a5445d8a2141e4fd6a8936c78d957ddea8b41ad3793"} Mar 13 12:08:10 crc kubenswrapper[4837]: I0313 12:08:10.338521 
4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d0f3b003-127f-414f-877a-8f7df2872049","Type":"ContainerStarted","Data":"7a5e4afce92c029361a67abdc5df9ad06561515c47e3e0277455062bb70f9bca"} Mar 13 12:08:10 crc kubenswrapper[4837]: I0313 12:08:10.343905 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4fdb7-b384-4d2d-9bd4-4d33884e828c","Type":"ContainerStarted","Data":"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be"} Mar 13 12:08:10 crc kubenswrapper[4837]: I0313 12:08:10.380787 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.380752175 podStartE2EDuration="3.380752175s" podCreationTimestamp="2026-03-13 12:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:10.35802159 +0000 UTC m=+1205.996288353" watchObservedRunningTime="2026-03-13 12:08:10.380752175 +0000 UTC m=+1206.019018938" Mar 13 12:08:11 crc kubenswrapper[4837]: I0313 12:08:11.217844 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:08:11 crc kubenswrapper[4837]: I0313 12:08:11.362371 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4fdb7-b384-4d2d-9bd4-4d33884e828c","Type":"ContainerStarted","Data":"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8"} Mar 13 12:08:11 crc kubenswrapper[4837]: I0313 12:08:11.795074 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 12:08:11 crc kubenswrapper[4837]: I0313 12:08:11.795155 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 12:08:11 crc kubenswrapper[4837]: I0313 
12:08:11.841940 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 12:08:11 crc kubenswrapper[4837]: I0313 12:08:11.845618 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 12:08:12 crc kubenswrapper[4837]: I0313 12:08:12.378269 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4fdb7-b384-4d2d-9bd4-4d33884e828c","Type":"ContainerStarted","Data":"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4"} Mar 13 12:08:12 crc kubenswrapper[4837]: I0313 12:08:12.378575 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 12:08:12 crc kubenswrapper[4837]: I0313 12:08:12.378594 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 12:08:14 crc kubenswrapper[4837]: I0313 12:08:14.399369 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 12:08:14 crc kubenswrapper[4837]: I0313 12:08:14.399716 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 12:08:14 crc kubenswrapper[4837]: I0313 12:08:14.441503 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 12:08:14 crc kubenswrapper[4837]: I0313 12:08:14.652920 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 12:08:17 crc kubenswrapper[4837]: I0313 12:08:17.975780 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 12:08:17 crc kubenswrapper[4837]: I0313 12:08:17.976444 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 
13 12:08:18 crc kubenswrapper[4837]: I0313 12:08:18.026404 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 12:08:18 crc kubenswrapper[4837]: I0313 12:08:18.026714 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 12:08:18 crc kubenswrapper[4837]: I0313 12:08:18.462773 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 12:08:18 crc kubenswrapper[4837]: I0313 12:08:18.463569 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 12:08:19 crc kubenswrapper[4837]: I0313 12:08:19.470593 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-f6gwd" event={"ID":"5d6d5bbe-7e5b-4645-95c4-af868cba3244","Type":"ContainerStarted","Data":"5337a2212bdc3b1dbf150fa95afc9aaae420bfce797da10558e36cb08bd46c77"} Mar 13 12:08:19 crc kubenswrapper[4837]: I0313 12:08:19.474678 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4fdb7-b384-4d2d-9bd4-4d33884e828c","Type":"ContainerStarted","Data":"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f"} Mar 13 12:08:19 crc kubenswrapper[4837]: I0313 12:08:19.474952 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="ceilometer-central-agent" containerID="cri-o://8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be" gracePeriod=30 Mar 13 12:08:19 crc kubenswrapper[4837]: I0313 12:08:19.475059 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 12:08:19 crc kubenswrapper[4837]: I0313 12:08:19.475107 4837 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="proxy-httpd" containerID="cri-o://fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f" gracePeriod=30
Mar 13 12:08:19 crc kubenswrapper[4837]: I0313 12:08:19.475143 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="sg-core" containerID="cri-o://670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4" gracePeriod=30
Mar 13 12:08:19 crc kubenswrapper[4837]: I0313 12:08:19.475177 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="ceilometer-notification-agent" containerID="cri-o://f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8" gracePeriod=30
Mar 13 12:08:19 crc kubenswrapper[4837]: I0313 12:08:19.492309 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-f6gwd" podStartSLOduration=2.241168462 podStartE2EDuration="13.492284707s" podCreationTimestamp="2026-03-13 12:08:06 +0000 UTC" firstStartedPulling="2026-03-13 12:08:07.594509062 +0000 UTC m=+1203.232775825" lastFinishedPulling="2026-03-13 12:08:18.845625297 +0000 UTC m=+1214.483892070" observedRunningTime="2026-03-13 12:08:19.483858303 +0000 UTC m=+1215.122125066" watchObservedRunningTime="2026-03-13 12:08:19.492284707 +0000 UTC m=+1215.130551480"
Mar 13 12:08:19 crc kubenswrapper[4837]: I0313 12:08:19.532974 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.93763292 podStartE2EDuration="11.532945716s" podCreationTimestamp="2026-03-13 12:08:08 +0000 UTC" firstStartedPulling="2026-03-13 12:08:09.250385004 +0000 UTC m=+1204.888651767" lastFinishedPulling="2026-03-13 12:08:18.8456978 +0000 UTC m=+1214.483964563" observedRunningTime="2026-03-13 12:08:19.505294556 +0000 UTC m=+1215.143561319" watchObservedRunningTime="2026-03-13 12:08:19.532945716 +0000 UTC m=+1215.171212509"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.366577 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.485496 4837 generic.go:334] "Generic (PLEG): container finished" podID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerID="fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f" exitCode=0
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.485539 4837 generic.go:334] "Generic (PLEG): container finished" podID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerID="670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4" exitCode=2
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.485552 4837 generic.go:334] "Generic (PLEG): container finished" podID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerID="f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8" exitCode=0
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.485561 4837 generic.go:334] "Generic (PLEG): container finished" podID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerID="8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be" exitCode=0
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.485680 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.485693 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.485790 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.486590 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4fdb7-b384-4d2d-9bd4-4d33884e828c","Type":"ContainerDied","Data":"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f"}
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.486622 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4fdb7-b384-4d2d-9bd4-4d33884e828c","Type":"ContainerDied","Data":"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4"}
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.486637 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4fdb7-b384-4d2d-9bd4-4d33884e828c","Type":"ContainerDied","Data":"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8"}
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.486647 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4fdb7-b384-4d2d-9bd4-4d33884e828c","Type":"ContainerDied","Data":"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be"}
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.486725 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4fdb7-b384-4d2d-9bd4-4d33884e828c","Type":"ContainerDied","Data":"a06c9f38e731b74f1c733f23ddeb517873bc8240455c11686148ecef7617ff18"}
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.486742 4837 scope.go:117] "RemoveContainer" containerID="fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.509814 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-scripts\") pod \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") "
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.509898 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-config-data\") pod \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") "
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.509927 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-sg-core-conf-yaml\") pod \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") "
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.510006 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-run-httpd\") pod \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") "
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.510077 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6jrg\" (UniqueName: \"kubernetes.io/projected/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-kube-api-access-p6jrg\") pod \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") "
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.510108 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-combined-ca-bundle\") pod \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") "
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.510177 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-log-httpd\") pod \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") "
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.510845 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "08c4fdb7-b384-4d2d-9bd4-4d33884e828c" (UID: "08c4fdb7-b384-4d2d-9bd4-4d33884e828c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.514049 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "08c4fdb7-b384-4d2d-9bd4-4d33884e828c" (UID: "08c4fdb7-b384-4d2d-9bd4-4d33884e828c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.515880 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-scripts" (OuterVolumeSpecName: "scripts") pod "08c4fdb7-b384-4d2d-9bd4-4d33884e828c" (UID: "08c4fdb7-b384-4d2d-9bd4-4d33884e828c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.523668 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-kube-api-access-p6jrg" (OuterVolumeSpecName: "kube-api-access-p6jrg") pod "08c4fdb7-b384-4d2d-9bd4-4d33884e828c" (UID: "08c4fdb7-b384-4d2d-9bd4-4d33884e828c"). InnerVolumeSpecName "kube-api-access-p6jrg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.533950 4837 scope.go:117] "RemoveContainer" containerID="670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.551013 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "08c4fdb7-b384-4d2d-9bd4-4d33884e828c" (UID: "08c4fdb7-b384-4d2d-9bd4-4d33884e828c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.554808 4837 scope.go:117] "RemoveContainer" containerID="f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.560744 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.563122 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.593862 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08c4fdb7-b384-4d2d-9bd4-4d33884e828c" (UID: "08c4fdb7-b384-4d2d-9bd4-4d33884e828c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.604919 4837 scope.go:117] "RemoveContainer" containerID="8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.614283 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.614569 4837 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.614580 4837 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.614588 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6jrg\" (UniqueName: \"kubernetes.io/projected/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-kube-api-access-p6jrg\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.614597 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.614605 4837 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.639031 4837 scope.go:117] "RemoveContainer" containerID="fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f"
Mar 13 12:08:20 crc kubenswrapper[4837]: E0313 12:08:20.639948 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f\": container with ID starting with fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f not found: ID does not exist" containerID="fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.639977 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f"} err="failed to get container status \"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f\": rpc error: code = NotFound desc = could not find container \"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f\": container with ID starting with fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f not found: ID does not exist"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.639997 4837 scope.go:117] "RemoveContainer" containerID="670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4"
Mar 13 12:08:20 crc kubenswrapper[4837]: E0313 12:08:20.640352 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4\": container with ID starting with 670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4 not found: ID does not exist" containerID="670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.640378 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4"} err="failed to get container status \"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4\": rpc error: code = NotFound desc = could not find container \"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4\": container with ID starting with 670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4 not found: ID does not exist"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.640391 4837 scope.go:117] "RemoveContainer" containerID="f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8"
Mar 13 12:08:20 crc kubenswrapper[4837]: E0313 12:08:20.640599 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8\": container with ID starting with f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8 not found: ID does not exist" containerID="f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.640613 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8"} err="failed to get container status \"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8\": rpc error: code = NotFound desc = could not find container \"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8\": container with ID starting with f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8 not found: ID does not exist"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.640625 4837 scope.go:117] "RemoveContainer" containerID="8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be"
Mar 13 12:08:20 crc kubenswrapper[4837]: E0313 12:08:20.640854 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be\": container with ID starting with 8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be not found: ID does not exist" containerID="8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.640868 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be"} err="failed to get container status \"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be\": rpc error: code = NotFound desc = could not find container \"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be\": container with ID starting with 8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be not found: ID does not exist"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.640881 4837 scope.go:117] "RemoveContainer" containerID="fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.642611 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f"} err="failed to get container status \"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f\": rpc error: code = NotFound desc = could not find container \"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f\": container with ID starting with fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f not found: ID does not exist"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.642634 4837 scope.go:117] "RemoveContainer" containerID="670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.643135 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4"} err="failed to get container status \"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4\": rpc error: code = NotFound desc = could not find container \"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4\": container with ID starting with 670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4 not found: ID does not exist"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.643151 4837 scope.go:117] "RemoveContainer" containerID="f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.643370 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8"} err="failed to get container status \"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8\": rpc error: code = NotFound desc = could not find container \"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8\": container with ID starting with f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8 not found: ID does not exist"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.643397 4837 scope.go:117] "RemoveContainer" containerID="8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.645012 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be"} err="failed to get container status \"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be\": rpc error: code = NotFound desc = could not find container \"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be\": container with ID starting with 8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be not found: ID does not exist"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.645041 4837 scope.go:117] "RemoveContainer" containerID="fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.645600 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f"} err="failed to get container status \"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f\": rpc error: code = NotFound desc = could not find container \"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f\": container with ID starting with fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f not found: ID does not exist"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.645713 4837 scope.go:117] "RemoveContainer" containerID="670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.646164 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4"} err="failed to get container status \"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4\": rpc error: code = NotFound desc = could not find container \"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4\": container with ID starting with 670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4 not found: ID does not exist"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.646217 4837 scope.go:117] "RemoveContainer" containerID="f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.646505 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8"} err="failed to get container status \"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8\": rpc error: code = NotFound desc = could not find container \"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8\": container with ID starting with f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8 not found: ID does not exist"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.646531 4837 scope.go:117] "RemoveContainer" containerID="8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.646826 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be"} err="failed to get container status \"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be\": rpc error: code = NotFound desc = could not find container \"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be\": container with ID starting with 8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be not found: ID does not exist"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.646855 4837 scope.go:117] "RemoveContainer" containerID="fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.647095 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f"} err="failed to get container status \"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f\": rpc error: code = NotFound desc = could not find container \"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f\": container with ID starting with fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f not found: ID does not exist"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.647119 4837 scope.go:117] "RemoveContainer" containerID="670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.647302 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4"} err="failed to get container status \"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4\": rpc error: code = NotFound desc = could not find container \"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4\": container with ID starting with 670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4 not found: ID does not exist"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.647341 4837 scope.go:117] "RemoveContainer" containerID="f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.647574 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8"} err="failed to get container status \"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8\": rpc error: code = NotFound desc = could not find container \"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8\": container with ID starting with f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8 not found: ID does not exist"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.647601 4837 scope.go:117] "RemoveContainer" containerID="8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.647776 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be"} err="failed to get container status \"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be\": rpc error: code = NotFound desc = could not find container \"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be\": container with ID starting with 8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be not found: ID does not exist"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.649839 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-config-data" (OuterVolumeSpecName: "config-data") pod "08c4fdb7-b384-4d2d-9bd4-4d33884e828c" (UID: "08c4fdb7-b384-4d2d-9bd4-4d33884e828c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.716778 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.819321 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.829114 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.847148 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 13 12:08:20 crc kubenswrapper[4837]: E0313 12:08:20.847798 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="ceilometer-central-agent"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.847816 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="ceilometer-central-agent"
Mar 13 12:08:20 crc kubenswrapper[4837]: E0313 12:08:20.847842 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="proxy-httpd"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.847850 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="proxy-httpd"
Mar 13 12:08:20 crc kubenswrapper[4837]: E0313 12:08:20.847863 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="ceilometer-notification-agent"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.847870 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="ceilometer-notification-agent"
Mar 13 12:08:20 crc kubenswrapper[4837]: E0313 12:08:20.847885 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="sg-core"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.847892 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="sg-core"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.848075 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="ceilometer-notification-agent"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.848091 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="ceilometer-central-agent"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.848103 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="sg-core"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.848118 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="proxy-httpd"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.851121 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.853773 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.853903 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.861686 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.023507 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w9zs\" (UniqueName: \"kubernetes.io/projected/f9e01084-6025-433d-99d8-36d2c555c685-kube-api-access-7w9zs\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0"
Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.023567 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9e01084-6025-433d-99d8-36d2c555c685-run-httpd\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0"
Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.023637 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-scripts\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0"
Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.023863 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9e01084-6025-433d-99d8-36d2c555c685-log-httpd\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0"
Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.023923 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0"
Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.024050 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-config-data\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0"
Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.024088 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0"
Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.066991 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" path="/var/lib/kubelet/pods/08c4fdb7-b384-4d2d-9bd4-4d33884e828c/volumes"
Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.126052 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-scripts\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0"
Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.126134 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9e01084-6025-433d-99d8-36d2c555c685-log-httpd\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0"
Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.126159 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0"
Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.126206 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-config-data\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0"
Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.126224 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0"
Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.126415 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w9zs\" (UniqueName: \"kubernetes.io/projected/f9e01084-6025-433d-99d8-36d2c555c685-kube-api-access-7w9zs\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0"
Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.126439 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9e01084-6025-433d-99d8-36d2c555c685-run-httpd\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0"
Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.127960 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9e01084-6025-433d-99d8-36d2c555c685-run-httpd\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0"
Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.138951 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9e01084-6025-433d-99d8-36d2c555c685-log-httpd\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0"
Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.140441 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0"
Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.140893 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-config-data\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0"
Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.141232 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0"
Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.143338 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-scripts\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0"
Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.146258 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w9zs\" (UniqueName: \"kubernetes.io/projected/f9e01084-6025-433d-99d8-36d2c555c685-kube-api-access-7w9zs\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0"
Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.347861 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.806551 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 12:08:22 crc kubenswrapper[4837]: I0313 12:08:22.514862 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9e01084-6025-433d-99d8-36d2c555c685","Type":"ContainerStarted","Data":"f59c8e71a086a620d4220bad0b44420b5cdb08a3b6ad9da08898e9871162295e"}
Mar 13 12:08:23 crc kubenswrapper[4837]: I0313 12:08:23.524085 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9e01084-6025-433d-99d8-36d2c555c685","Type":"ContainerStarted","Data":"8b144003cd78c701982819cb7d0748db2e568902b9482e1bbdfbf0e7a4fd8a29"}
Mar 13 12:08:23 crc kubenswrapper[4837]: I0313 12:08:23.524154 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9e01084-6025-433d-99d8-36d2c555c685","Type":"ContainerStarted","Data":"6bdb083352f5cd3edf1d10575e1157207eed447e6f437afa820f995c0285299c"}
Mar 13 12:08:24 crc kubenswrapper[4837]: I0313 12:08:24.534170 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9e01084-6025-433d-99d8-36d2c555c685","Type":"ContainerStarted","Data":"e6fcba9318ec55a9f50d812ffd1a39b187c76c67f7a03dde9a6836d845ab1f00"}
Mar 13 12:08:26 crc kubenswrapper[4837]: I0313 12:08:26.552500 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0"
event={"ID":"f9e01084-6025-433d-99d8-36d2c555c685","Type":"ContainerStarted","Data":"49ab00cb0159b6758d37c40cf93ce155c8aa226ad0a842f4f68d853167d1807f"} Mar 13 12:08:26 crc kubenswrapper[4837]: I0313 12:08:26.553058 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 12:08:26 crc kubenswrapper[4837]: I0313 12:08:26.567730 4837 scope.go:117] "RemoveContainer" containerID="bf1679f5dae4d4dbf23dda0605e595646a6c9aa5a55d2f380823eb7ec590b836" Mar 13 12:08:26 crc kubenswrapper[4837]: I0313 12:08:26.575302 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.087078206 podStartE2EDuration="6.575282277s" podCreationTimestamp="2026-03-13 12:08:20 +0000 UTC" firstStartedPulling="2026-03-13 12:08:21.809530232 +0000 UTC m=+1217.447796995" lastFinishedPulling="2026-03-13 12:08:25.297734303 +0000 UTC m=+1220.936001066" observedRunningTime="2026-03-13 12:08:26.568947108 +0000 UTC m=+1222.207213881" watchObservedRunningTime="2026-03-13 12:08:26.575282277 +0000 UTC m=+1222.213549040" Mar 13 12:08:27 crc kubenswrapper[4837]: I0313 12:08:27.169279 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:08:28 crc kubenswrapper[4837]: I0313 12:08:28.566301 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="ceilometer-central-agent" containerID="cri-o://8b144003cd78c701982819cb7d0748db2e568902b9482e1bbdfbf0e7a4fd8a29" gracePeriod=30 Mar 13 12:08:28 crc kubenswrapper[4837]: I0313 12:08:28.566351 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="sg-core" containerID="cri-o://e6fcba9318ec55a9f50d812ffd1a39b187c76c67f7a03dde9a6836d845ab1f00" gracePeriod=30 Mar 13 12:08:28 crc kubenswrapper[4837]: I0313 
12:08:28.566377 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="ceilometer-notification-agent" containerID="cri-o://6bdb083352f5cd3edf1d10575e1157207eed447e6f437afa820f995c0285299c" gracePeriod=30 Mar 13 12:08:28 crc kubenswrapper[4837]: I0313 12:08:28.566401 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="proxy-httpd" containerID="cri-o://49ab00cb0159b6758d37c40cf93ce155c8aa226ad0a842f4f68d853167d1807f" gracePeriod=30 Mar 13 12:08:29 crc kubenswrapper[4837]: I0313 12:08:29.576481 4837 generic.go:334] "Generic (PLEG): container finished" podID="f9e01084-6025-433d-99d8-36d2c555c685" containerID="49ab00cb0159b6758d37c40cf93ce155c8aa226ad0a842f4f68d853167d1807f" exitCode=0 Mar 13 12:08:29 crc kubenswrapper[4837]: I0313 12:08:29.576517 4837 generic.go:334] "Generic (PLEG): container finished" podID="f9e01084-6025-433d-99d8-36d2c555c685" containerID="e6fcba9318ec55a9f50d812ffd1a39b187c76c67f7a03dde9a6836d845ab1f00" exitCode=2 Mar 13 12:08:29 crc kubenswrapper[4837]: I0313 12:08:29.576530 4837 generic.go:334] "Generic (PLEG): container finished" podID="f9e01084-6025-433d-99d8-36d2c555c685" containerID="6bdb083352f5cd3edf1d10575e1157207eed447e6f437afa820f995c0285299c" exitCode=0 Mar 13 12:08:29 crc kubenswrapper[4837]: I0313 12:08:29.576558 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9e01084-6025-433d-99d8-36d2c555c685","Type":"ContainerDied","Data":"49ab00cb0159b6758d37c40cf93ce155c8aa226ad0a842f4f68d853167d1807f"} Mar 13 12:08:29 crc kubenswrapper[4837]: I0313 12:08:29.576601 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f9e01084-6025-433d-99d8-36d2c555c685","Type":"ContainerDied","Data":"e6fcba9318ec55a9f50d812ffd1a39b187c76c67f7a03dde9a6836d845ab1f00"} Mar 13 12:08:29 crc kubenswrapper[4837]: I0313 12:08:29.576612 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9e01084-6025-433d-99d8-36d2c555c685","Type":"ContainerDied","Data":"6bdb083352f5cd3edf1d10575e1157207eed447e6f437afa820f995c0285299c"} Mar 13 12:08:29 crc kubenswrapper[4837]: I0313 12:08:29.578198 4837 generic.go:334] "Generic (PLEG): container finished" podID="5d6d5bbe-7e5b-4645-95c4-af868cba3244" containerID="5337a2212bdc3b1dbf150fa95afc9aaae420bfce797da10558e36cb08bd46c77" exitCode=0 Mar 13 12:08:29 crc kubenswrapper[4837]: I0313 12:08:29.578408 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-f6gwd" event={"ID":"5d6d5bbe-7e5b-4645-95c4-af868cba3244","Type":"ContainerDied","Data":"5337a2212bdc3b1dbf150fa95afc9aaae420bfce797da10558e36cb08bd46c77"} Mar 13 12:08:30 crc kubenswrapper[4837]: I0313 12:08:30.950703 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.130722 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-combined-ca-bundle\") pod \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.130787 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqhf2\" (UniqueName: \"kubernetes.io/projected/5d6d5bbe-7e5b-4645-95c4-af868cba3244-kube-api-access-gqhf2\") pod \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.130846 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-scripts\") pod \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.131029 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-config-data\") pod \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.136992 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-scripts" (OuterVolumeSpecName: "scripts") pod "5d6d5bbe-7e5b-4645-95c4-af868cba3244" (UID: "5d6d5bbe-7e5b-4645-95c4-af868cba3244"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.151434 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d6d5bbe-7e5b-4645-95c4-af868cba3244-kube-api-access-gqhf2" (OuterVolumeSpecName: "kube-api-access-gqhf2") pod "5d6d5bbe-7e5b-4645-95c4-af868cba3244" (UID: "5d6d5bbe-7e5b-4645-95c4-af868cba3244"). InnerVolumeSpecName "kube-api-access-gqhf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.158509 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-config-data" (OuterVolumeSpecName: "config-data") pod "5d6d5bbe-7e5b-4645-95c4-af868cba3244" (UID: "5d6d5bbe-7e5b-4645-95c4-af868cba3244"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.161624 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d6d5bbe-7e5b-4645-95c4-af868cba3244" (UID: "5d6d5bbe-7e5b-4645-95c4-af868cba3244"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.233476 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.233523 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqhf2\" (UniqueName: \"kubernetes.io/projected/5d6d5bbe-7e5b-4645-95c4-af868cba3244-kube-api-access-gqhf2\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.233542 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.233555 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.528410 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.607025 4837 generic.go:334] "Generic (PLEG): container finished" podID="f9e01084-6025-433d-99d8-36d2c555c685" containerID="8b144003cd78c701982819cb7d0748db2e568902b9482e1bbdfbf0e7a4fd8a29" exitCode=0 Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.607094 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.607120 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9e01084-6025-433d-99d8-36d2c555c685","Type":"ContainerDied","Data":"8b144003cd78c701982819cb7d0748db2e568902b9482e1bbdfbf0e7a4fd8a29"} Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.607155 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9e01084-6025-433d-99d8-36d2c555c685","Type":"ContainerDied","Data":"f59c8e71a086a620d4220bad0b44420b5cdb08a3b6ad9da08898e9871162295e"} Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.607173 4837 scope.go:117] "RemoveContainer" containerID="49ab00cb0159b6758d37c40cf93ce155c8aa226ad0a842f4f68d853167d1807f" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.611269 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-f6gwd" event={"ID":"5d6d5bbe-7e5b-4645-95c4-af868cba3244","Type":"ContainerDied","Data":"6144ca86ef9d0d9f5e120027d710fea9eb400bcc8f2a208f56ef661ebbec1f34"} Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.611339 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6144ca86ef9d0d9f5e120027d710fea9eb400bcc8f2a208f56ef661ebbec1f34" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.611427 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.631035 4837 scope.go:117] "RemoveContainer" containerID="e6fcba9318ec55a9f50d812ffd1a39b187c76c67f7a03dde9a6836d845ab1f00" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.643320 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-config-data\") pod \"f9e01084-6025-433d-99d8-36d2c555c685\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.643422 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-sg-core-conf-yaml\") pod \"f9e01084-6025-433d-99d8-36d2c555c685\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.643503 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-scripts\") pod \"f9e01084-6025-433d-99d8-36d2c555c685\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.643539 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9e01084-6025-433d-99d8-36d2c555c685-run-httpd\") pod \"f9e01084-6025-433d-99d8-36d2c555c685\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.643588 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w9zs\" (UniqueName: \"kubernetes.io/projected/f9e01084-6025-433d-99d8-36d2c555c685-kube-api-access-7w9zs\") pod \"f9e01084-6025-433d-99d8-36d2c555c685\" (UID: 
\"f9e01084-6025-433d-99d8-36d2c555c685\") " Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.643676 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-combined-ca-bundle\") pod \"f9e01084-6025-433d-99d8-36d2c555c685\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.643790 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9e01084-6025-433d-99d8-36d2c555c685-log-httpd\") pod \"f9e01084-6025-433d-99d8-36d2c555c685\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.643887 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9e01084-6025-433d-99d8-36d2c555c685-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f9e01084-6025-433d-99d8-36d2c555c685" (UID: "f9e01084-6025-433d-99d8-36d2c555c685"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.644260 4837 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9e01084-6025-433d-99d8-36d2c555c685-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.644258 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9e01084-6025-433d-99d8-36d2c555c685-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f9e01084-6025-433d-99d8-36d2c555c685" (UID: "f9e01084-6025-433d-99d8-36d2c555c685"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.655955 4837 scope.go:117] "RemoveContainer" containerID="6bdb083352f5cd3edf1d10575e1157207eed447e6f437afa820f995c0285299c" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.658227 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9e01084-6025-433d-99d8-36d2c555c685-kube-api-access-7w9zs" (OuterVolumeSpecName: "kube-api-access-7w9zs") pod "f9e01084-6025-433d-99d8-36d2c555c685" (UID: "f9e01084-6025-433d-99d8-36d2c555c685"). InnerVolumeSpecName "kube-api-access-7w9zs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.673741 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-scripts" (OuterVolumeSpecName: "scripts") pod "f9e01084-6025-433d-99d8-36d2c555c685" (UID: "f9e01084-6025-433d-99d8-36d2c555c685"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.691473 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f9e01084-6025-433d-99d8-36d2c555c685" (UID: "f9e01084-6025-433d-99d8-36d2c555c685"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.691498 4837 scope.go:117] "RemoveContainer" containerID="8b144003cd78c701982819cb7d0748db2e568902b9482e1bbdfbf0e7a4fd8a29" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.707447 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 12:08:31 crc kubenswrapper[4837]: E0313 12:08:31.708328 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d6d5bbe-7e5b-4645-95c4-af868cba3244" containerName="nova-cell0-conductor-db-sync" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.708358 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d6d5bbe-7e5b-4645-95c4-af868cba3244" containerName="nova-cell0-conductor-db-sync" Mar 13 12:08:31 crc kubenswrapper[4837]: E0313 12:08:31.708394 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="ceilometer-central-agent" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.708403 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="ceilometer-central-agent" Mar 13 12:08:31 crc kubenswrapper[4837]: E0313 12:08:31.708415 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="proxy-httpd" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.708424 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="proxy-httpd" Mar 13 12:08:31 crc kubenswrapper[4837]: E0313 12:08:31.708438 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="sg-core" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.708446 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="sg-core" Mar 
13 12:08:31 crc kubenswrapper[4837]: E0313 12:08:31.708464 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="ceilometer-notification-agent" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.708472 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="ceilometer-notification-agent" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.708701 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="sg-core" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.708715 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="ceilometer-notification-agent" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.708778 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="ceilometer-central-agent" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.708794 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d6d5bbe-7e5b-4645-95c4-af868cba3244" containerName="nova-cell0-conductor-db-sync" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.708808 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="proxy-httpd" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.709390 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.714091 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.714292 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qctwr" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.722361 4837 scope.go:117] "RemoveContainer" containerID="49ab00cb0159b6758d37c40cf93ce155c8aa226ad0a842f4f68d853167d1807f" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.724189 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 12:08:31 crc kubenswrapper[4837]: E0313 12:08:31.727252 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49ab00cb0159b6758d37c40cf93ce155c8aa226ad0a842f4f68d853167d1807f\": container with ID starting with 49ab00cb0159b6758d37c40cf93ce155c8aa226ad0a842f4f68d853167d1807f not found: ID does not exist" containerID="49ab00cb0159b6758d37c40cf93ce155c8aa226ad0a842f4f68d853167d1807f" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.727309 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49ab00cb0159b6758d37c40cf93ce155c8aa226ad0a842f4f68d853167d1807f"} err="failed to get container status \"49ab00cb0159b6758d37c40cf93ce155c8aa226ad0a842f4f68d853167d1807f\": rpc error: code = NotFound desc = could not find container \"49ab00cb0159b6758d37c40cf93ce155c8aa226ad0a842f4f68d853167d1807f\": container with ID starting with 49ab00cb0159b6758d37c40cf93ce155c8aa226ad0a842f4f68d853167d1807f not found: ID does not exist" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.727343 4837 scope.go:117] "RemoveContainer" containerID="e6fcba9318ec55a9f50d812ffd1a39b187c76c67f7a03dde9a6836d845ab1f00" Mar 13 
12:08:31 crc kubenswrapper[4837]: E0313 12:08:31.729448 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6fcba9318ec55a9f50d812ffd1a39b187c76c67f7a03dde9a6836d845ab1f00\": container with ID starting with e6fcba9318ec55a9f50d812ffd1a39b187c76c67f7a03dde9a6836d845ab1f00 not found: ID does not exist" containerID="e6fcba9318ec55a9f50d812ffd1a39b187c76c67f7a03dde9a6836d845ab1f00" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.729524 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6fcba9318ec55a9f50d812ffd1a39b187c76c67f7a03dde9a6836d845ab1f00"} err="failed to get container status \"e6fcba9318ec55a9f50d812ffd1a39b187c76c67f7a03dde9a6836d845ab1f00\": rpc error: code = NotFound desc = could not find container \"e6fcba9318ec55a9f50d812ffd1a39b187c76c67f7a03dde9a6836d845ab1f00\": container with ID starting with e6fcba9318ec55a9f50d812ffd1a39b187c76c67f7a03dde9a6836d845ab1f00 not found: ID does not exist" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.729590 4837 scope.go:117] "RemoveContainer" containerID="6bdb083352f5cd3edf1d10575e1157207eed447e6f437afa820f995c0285299c" Mar 13 12:08:31 crc kubenswrapper[4837]: E0313 12:08:31.730280 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bdb083352f5cd3edf1d10575e1157207eed447e6f437afa820f995c0285299c\": container with ID starting with 6bdb083352f5cd3edf1d10575e1157207eed447e6f437afa820f995c0285299c not found: ID does not exist" containerID="6bdb083352f5cd3edf1d10575e1157207eed447e6f437afa820f995c0285299c" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.730332 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bdb083352f5cd3edf1d10575e1157207eed447e6f437afa820f995c0285299c"} err="failed to get container status 
\"6bdb083352f5cd3edf1d10575e1157207eed447e6f437afa820f995c0285299c\": rpc error: code = NotFound desc = could not find container \"6bdb083352f5cd3edf1d10575e1157207eed447e6f437afa820f995c0285299c\": container with ID starting with 6bdb083352f5cd3edf1d10575e1157207eed447e6f437afa820f995c0285299c not found: ID does not exist" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.730356 4837 scope.go:117] "RemoveContainer" containerID="8b144003cd78c701982819cb7d0748db2e568902b9482e1bbdfbf0e7a4fd8a29" Mar 13 12:08:31 crc kubenswrapper[4837]: E0313 12:08:31.730897 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b144003cd78c701982819cb7d0748db2e568902b9482e1bbdfbf0e7a4fd8a29\": container with ID starting with 8b144003cd78c701982819cb7d0748db2e568902b9482e1bbdfbf0e7a4fd8a29 not found: ID does not exist" containerID="8b144003cd78c701982819cb7d0748db2e568902b9482e1bbdfbf0e7a4fd8a29" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.730928 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b144003cd78c701982819cb7d0748db2e568902b9482e1bbdfbf0e7a4fd8a29"} err="failed to get container status \"8b144003cd78c701982819cb7d0748db2e568902b9482e1bbdfbf0e7a4fd8a29\": rpc error: code = NotFound desc = could not find container \"8b144003cd78c701982819cb7d0748db2e568902b9482e1bbdfbf0e7a4fd8a29\": container with ID starting with 8b144003cd78c701982819cb7d0748db2e568902b9482e1bbdfbf0e7a4fd8a29 not found: ID does not exist" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.743045 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9e01084-6025-433d-99d8-36d2c555c685" (UID: "f9e01084-6025-433d-99d8-36d2c555c685"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.746113 4837 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.746148 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.746161 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w9zs\" (UniqueName: \"kubernetes.io/projected/f9e01084-6025-433d-99d8-36d2c555c685-kube-api-access-7w9zs\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.746171 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.746179 4837 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9e01084-6025-433d-99d8-36d2c555c685-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.764621 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-config-data" (OuterVolumeSpecName: "config-data") pod "f9e01084-6025-433d-99d8-36d2c555c685" (UID: "f9e01084-6025-433d-99d8-36d2c555c685"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.848205 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58240a84-c8ab-43a9-8113-eaf2d0ddea2e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"58240a84-c8ab-43a9-8113-eaf2d0ddea2e\") " pod="openstack/nova-cell0-conductor-0"
Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.848279 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58240a84-c8ab-43a9-8113-eaf2d0ddea2e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"58240a84-c8ab-43a9-8113-eaf2d0ddea2e\") " pod="openstack/nova-cell0-conductor-0"
Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.849126 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnx86\" (UniqueName: \"kubernetes.io/projected/58240a84-c8ab-43a9-8113-eaf2d0ddea2e-kube-api-access-jnx86\") pod \"nova-cell0-conductor-0\" (UID: \"58240a84-c8ab-43a9-8113-eaf2d0ddea2e\") " pod="openstack/nova-cell0-conductor-0"
Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.849387 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.944363 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.951321 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58240a84-c8ab-43a9-8113-eaf2d0ddea2e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"58240a84-c8ab-43a9-8113-eaf2d0ddea2e\") " pod="openstack/nova-cell0-conductor-0"
Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.951394 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58240a84-c8ab-43a9-8113-eaf2d0ddea2e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"58240a84-c8ab-43a9-8113-eaf2d0ddea2e\") " pod="openstack/nova-cell0-conductor-0"
Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.951457 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnx86\" (UniqueName: \"kubernetes.io/projected/58240a84-c8ab-43a9-8113-eaf2d0ddea2e-kube-api-access-jnx86\") pod \"nova-cell0-conductor-0\" (UID: \"58240a84-c8ab-43a9-8113-eaf2d0ddea2e\") " pod="openstack/nova-cell0-conductor-0"
Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.956674 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.960837 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58240a84-c8ab-43a9-8113-eaf2d0ddea2e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"58240a84-c8ab-43a9-8113-eaf2d0ddea2e\") " pod="openstack/nova-cell0-conductor-0"
Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.973087 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnx86\" (UniqueName: \"kubernetes.io/projected/58240a84-c8ab-43a9-8113-eaf2d0ddea2e-kube-api-access-jnx86\") pod \"nova-cell0-conductor-0\" (UID: \"58240a84-c8ab-43a9-8113-eaf2d0ddea2e\") " pod="openstack/nova-cell0-conductor-0"
Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.978830 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58240a84-c8ab-43a9-8113-eaf2d0ddea2e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"58240a84-c8ab-43a9-8113-eaf2d0ddea2e\") " pod="openstack/nova-cell0-conductor-0"
Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.984802 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.987563 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.991152 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.991289 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:31.999107 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.035337 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.155530 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-config-data\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0"
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.155980 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzlqd\" (UniqueName: \"kubernetes.io/projected/a7f70330-cb87-42e5-96c8-6d54828f2a5a-kube-api-access-lzlqd\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0"
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.156024 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7f70330-cb87-42e5-96c8-6d54828f2a5a-log-httpd\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0"
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.156054 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-scripts\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0"
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.156288 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7f70330-cb87-42e5-96c8-6d54828f2a5a-run-httpd\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0"
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.156365 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0"
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.156410 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0"
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.258325 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-config-data\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0"
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.258379 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzlqd\" (UniqueName: \"kubernetes.io/projected/a7f70330-cb87-42e5-96c8-6d54828f2a5a-kube-api-access-lzlqd\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0"
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.258429 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7f70330-cb87-42e5-96c8-6d54828f2a5a-log-httpd\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0"
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.258457 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-scripts\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0"
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.258536 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7f70330-cb87-42e5-96c8-6d54828f2a5a-run-httpd\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0"
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.258574 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0"
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.258610 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0"
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.259078 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7f70330-cb87-42e5-96c8-6d54828f2a5a-log-httpd\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0"
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.259165 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7f70330-cb87-42e5-96c8-6d54828f2a5a-run-httpd\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0"
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.263759 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0"
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.264247 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-config-data\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0"
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.264702 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-scripts\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0"
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.268340 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0"
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.276699 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzlqd\" (UniqueName: \"kubernetes.io/projected/a7f70330-cb87-42e5-96c8-6d54828f2a5a-kube-api-access-lzlqd\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0"
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.459762 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.467262 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.622821 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"58240a84-c8ab-43a9-8113-eaf2d0ddea2e","Type":"ContainerStarted","Data":"ca3091bb68a4c8ccce552dcb4050de7d08cec268f979f37a313240951c2a5722"}
Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.906093 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 12:08:32 crc kubenswrapper[4837]: W0313 12:08:32.906793 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7f70330_cb87_42e5_96c8_6d54828f2a5a.slice/crio-c1caae87e2bfbe9657e4b62036ebd200f6d2445955d6ab66a4adb47c94a2fae0 WatchSource:0}: Error finding container c1caae87e2bfbe9657e4b62036ebd200f6d2445955d6ab66a4adb47c94a2fae0: Status 404 returned error can't find the container with id c1caae87e2bfbe9657e4b62036ebd200f6d2445955d6ab66a4adb47c94a2fae0
Mar 13 12:08:33 crc kubenswrapper[4837]: I0313 12:08:33.063276 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9e01084-6025-433d-99d8-36d2c555c685" path="/var/lib/kubelet/pods/f9e01084-6025-433d-99d8-36d2c555c685/volumes"
Mar 13 12:08:33 crc kubenswrapper[4837]: I0313 12:08:33.634661 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"58240a84-c8ab-43a9-8113-eaf2d0ddea2e","Type":"ContainerStarted","Data":"b6b42276aa11a1a7a9b37c345c43f7aaa45f27dcce528886b8a09316471865cf"}
Mar 13 12:08:33 crc kubenswrapper[4837]: I0313 12:08:33.634993 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Mar 13 12:08:33 crc kubenswrapper[4837]: I0313 12:08:33.637057 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7f70330-cb87-42e5-96c8-6d54828f2a5a","Type":"ContainerStarted","Data":"c1caae87e2bfbe9657e4b62036ebd200f6d2445955d6ab66a4adb47c94a2fae0"}
Mar 13 12:08:33 crc kubenswrapper[4837]: I0313 12:08:33.677405 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.677377967 podStartE2EDuration="2.677377967s" podCreationTimestamp="2026-03-13 12:08:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:33.647666594 +0000 UTC m=+1229.285933367" watchObservedRunningTime="2026-03-13 12:08:33.677377967 +0000 UTC m=+1229.315644730"
Mar 13 12:08:34 crc kubenswrapper[4837]: I0313 12:08:34.649553 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7f70330-cb87-42e5-96c8-6d54828f2a5a","Type":"ContainerStarted","Data":"d8b81a1d862c648975bd9a812fe1d61df727077dd39a97f4adfc70dac6066075"}
Mar 13 12:08:35 crc kubenswrapper[4837]: I0313 12:08:35.484394 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 12:08:35 crc kubenswrapper[4837]: I0313 12:08:35.484483 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 12:08:35 crc kubenswrapper[4837]: I0313 12:08:35.484558 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d"
Mar 13 12:08:35 crc kubenswrapper[4837]: I0313 12:08:35.485755 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75c6e15833f1c4c6d83b741f42f4ce0c9378844641d1d149fd75349d257dfc71"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 12:08:35 crc kubenswrapper[4837]: I0313 12:08:35.485837 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://75c6e15833f1c4c6d83b741f42f4ce0c9378844641d1d149fd75349d257dfc71" gracePeriod=600
Mar 13 12:08:35 crc kubenswrapper[4837]: I0313 12:08:35.660922 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="75c6e15833f1c4c6d83b741f42f4ce0c9378844641d1d149fd75349d257dfc71" exitCode=0
Mar 13 12:08:35 crc kubenswrapper[4837]: I0313 12:08:35.661015 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"75c6e15833f1c4c6d83b741f42f4ce0c9378844641d1d149fd75349d257dfc71"}
Mar 13 12:08:35 crc kubenswrapper[4837]: I0313 12:08:35.661318 4837 scope.go:117] "RemoveContainer" containerID="62df99fa64e257c350cea1390039e0bd2f2c672bf6d80836ec3df94beec3d8d1"
Mar 13 12:08:35 crc kubenswrapper[4837]: I0313 12:08:35.664203 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7f70330-cb87-42e5-96c8-6d54828f2a5a","Type":"ContainerStarted","Data":"6d24d7cecf025123d4d281213efc8079b0cb18a3f100808ee593959500d93094"}
Mar 13 12:08:36 crc kubenswrapper[4837]: I0313 12:08:36.488196 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="9fdb2289-943a-4078-ab5f-cab9a7b4faf1" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.160:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 12:08:36 crc kubenswrapper[4837]: I0313 12:08:36.488249 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="9fdb2289-943a-4078-ab5f-cab9a7b4faf1" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.160:9292/healthcheck\": context deadline exceeded"
Mar 13 12:08:36 crc kubenswrapper[4837]: I0313 12:08:36.673530 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"1c6d9e7e9de5c8ffd75bcf8d5717605d713d8068815596e54e918770c94282bc"}
Mar 13 12:08:36 crc kubenswrapper[4837]: I0313 12:08:36.676520 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7f70330-cb87-42e5-96c8-6d54828f2a5a","Type":"ContainerStarted","Data":"a9f4ef9baf51c5a45fe25c828b539addde1c0065712a676f95056b2183f00569"}
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.088620 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.569984 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-xlps2"]
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.571980 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xlps2"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.576128 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.576880 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.585099 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xlps2"]
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.672994 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-config-data\") pod \"nova-cell0-cell-mapping-xlps2\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " pod="openstack/nova-cell0-cell-mapping-xlps2"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.673129 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-scripts\") pod \"nova-cell0-cell-mapping-xlps2\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " pod="openstack/nova-cell0-cell-mapping-xlps2"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.673167 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fltcz\" (UniqueName: \"kubernetes.io/projected/53268342-9adb-48b3-ba5b-52634c2c68fe-kube-api-access-fltcz\") pod \"nova-cell0-cell-mapping-xlps2\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " pod="openstack/nova-cell0-cell-mapping-xlps2"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.673273 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xlps2\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " pod="openstack/nova-cell0-cell-mapping-xlps2"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.729054 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.730572 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.741538 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.748558 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.776548 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-config-data\") pod \"nova-cell0-cell-mapping-xlps2\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " pod="openstack/nova-cell0-cell-mapping-xlps2"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.776694 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-scripts\") pod \"nova-cell0-cell-mapping-xlps2\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " pod="openstack/nova-cell0-cell-mapping-xlps2"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.776723 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fltcz\" (UniqueName: \"kubernetes.io/projected/53268342-9adb-48b3-ba5b-52634c2c68fe-kube-api-access-fltcz\") pod \"nova-cell0-cell-mapping-xlps2\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " pod="openstack/nova-cell0-cell-mapping-xlps2"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.776818 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xlps2\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " pod="openstack/nova-cell0-cell-mapping-xlps2"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.793398 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-scripts\") pod \"nova-cell0-cell-mapping-xlps2\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " pod="openstack/nova-cell0-cell-mapping-xlps2"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.793709 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xlps2\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " pod="openstack/nova-cell0-cell-mapping-xlps2"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.803054 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-config-data\") pod \"nova-cell0-cell-mapping-xlps2\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " pod="openstack/nova-cell0-cell-mapping-xlps2"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.815229 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.816607 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.818396 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fltcz\" (UniqueName: \"kubernetes.io/projected/53268342-9adb-48b3-ba5b-52634c2c68fe-kube-api-access-fltcz\") pod \"nova-cell0-cell-mapping-xlps2\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " pod="openstack/nova-cell0-cell-mapping-xlps2"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.820257 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.824204 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.878785 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ec286a-b6df-4462-8023-c01230a50793-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"81ec286a-b6df-4462-8023-c01230a50793\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.878907 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81ec286a-b6df-4462-8023-c01230a50793-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"81ec286a-b6df-4462-8023-c01230a50793\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.879094 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srf5b\" (UniqueName: \"kubernetes.io/projected/81ec286a-b6df-4462-8023-c01230a50793-kube-api-access-srf5b\") pod \"nova-cell1-novncproxy-0\" (UID: \"81ec286a-b6df-4462-8023-c01230a50793\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.897149 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xlps2"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.932299 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.933711 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.948432 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.980333 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707289ff-1434-49b7-904a-58decfdd53ca-config-data\") pod \"nova-scheduler-0\" (UID: \"707289ff-1434-49b7-904a-58decfdd53ca\") " pod="openstack/nova-scheduler-0"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.980437 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srf5b\" (UniqueName: \"kubernetes.io/projected/81ec286a-b6df-4462-8023-c01230a50793-kube-api-access-srf5b\") pod \"nova-cell1-novncproxy-0\" (UID: \"81ec286a-b6df-4462-8023-c01230a50793\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.980468 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ec286a-b6df-4462-8023-c01230a50793-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"81ec286a-b6df-4462-8023-c01230a50793\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.980546 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81ec286a-b6df-4462-8023-c01230a50793-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"81ec286a-b6df-4462-8023-c01230a50793\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.980623 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707289ff-1434-49b7-904a-58decfdd53ca-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"707289ff-1434-49b7-904a-58decfdd53ca\") " pod="openstack/nova-scheduler-0"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.980715 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzqjg\" (UniqueName: \"kubernetes.io/projected/707289ff-1434-49b7-904a-58decfdd53ca-kube-api-access-jzqjg\") pod \"nova-scheduler-0\" (UID: \"707289ff-1434-49b7-904a-58decfdd53ca\") " pod="openstack/nova-scheduler-0"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.988591 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ec286a-b6df-4462-8023-c01230a50793-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"81ec286a-b6df-4462-8023-c01230a50793\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.996369 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81ec286a-b6df-4462-8023-c01230a50793-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"81ec286a-b6df-4462-8023-c01230a50793\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.996452 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.029111 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srf5b\" (UniqueName: \"kubernetes.io/projected/81ec286a-b6df-4462-8023-c01230a50793-kube-api-access-srf5b\") pod \"nova-cell1-novncproxy-0\" (UID: \"81ec286a-b6df-4462-8023-c01230a50793\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.063207 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.086493 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85cfl\" (UniqueName: \"kubernetes.io/projected/f3179576-07e2-4e05-8d10-01e3d694863b-kube-api-access-85cfl\") pod \"nova-api-0\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " pod="openstack/nova-api-0"
Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.086603 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3179576-07e2-4e05-8d10-01e3d694863b-config-data\") pod \"nova-api-0\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " pod="openstack/nova-api-0"
Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.086683 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707289ff-1434-49b7-904a-58decfdd53ca-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"707289ff-1434-49b7-904a-58decfdd53ca\") " pod="openstack/nova-scheduler-0"
Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.086729 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3179576-07e2-4e05-8d10-01e3d694863b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " pod="openstack/nova-api-0"
Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.086762 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3179576-07e2-4e05-8d10-01e3d694863b-logs\") pod \"nova-api-0\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " pod="openstack/nova-api-0"
Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.086788 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzqjg\" (UniqueName: \"kubernetes.io/projected/707289ff-1434-49b7-904a-58decfdd53ca-kube-api-access-jzqjg\") pod \"nova-scheduler-0\" (UID: \"707289ff-1434-49b7-904a-58decfdd53ca\") " pod="openstack/nova-scheduler-0"
Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.086834 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707289ff-1434-49b7-904a-58decfdd53ca-config-data\") pod \"nova-scheduler-0\" (UID: \"707289ff-1434-49b7-904a-58decfdd53ca\") " pod="openstack/nova-scheduler-0"
Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.091622 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707289ff-1434-49b7-904a-58decfdd53ca-config-data\") pod \"nova-scheduler-0\" (UID: \"707289ff-1434-49b7-904a-58decfdd53ca\") " pod="openstack/nova-scheduler-0"
Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.097940 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707289ff-1434-49b7-904a-58decfdd53ca-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"707289ff-1434-49b7-904a-58decfdd53ca\") " pod="openstack/nova-scheduler-0"
Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.111833 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.113597 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.127468 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.184630 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.198479 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzqjg\" (UniqueName: \"kubernetes.io/projected/707289ff-1434-49b7-904a-58decfdd53ca-kube-api-access-jzqjg\") pod \"nova-scheduler-0\" (UID: \"707289ff-1434-49b7-904a-58decfdd53ca\") " pod="openstack/nova-scheduler-0"
Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.199222 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820f49e8-5f60-46ba-80a8-6314d4ae2c48-config-data\") pod \"nova-metadata-0\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " pod="openstack/nova-metadata-0"
Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.199265 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gkl7\" (UniqueName: \"kubernetes.io/projected/820f49e8-5f60-46ba-80a8-6314d4ae2c48-kube-api-access-6gkl7\") pod \"nova-metadata-0\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " pod="openstack/nova-metadata-0"
Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.199318 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/820f49e8-5f60-46ba-80a8-6314d4ae2c48-logs\") pod \"nova-metadata-0\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " pod="openstack/nova-metadata-0"
Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.199436 4837 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820f49e8-5f60-46ba-80a8-6314d4ae2c48-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " pod="openstack/nova-metadata-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.199793 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85cfl\" (UniqueName: \"kubernetes.io/projected/f3179576-07e2-4e05-8d10-01e3d694863b-kube-api-access-85cfl\") pod \"nova-api-0\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " pod="openstack/nova-api-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.200593 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3179576-07e2-4e05-8d10-01e3d694863b-config-data\") pod \"nova-api-0\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " pod="openstack/nova-api-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.200964 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3179576-07e2-4e05-8d10-01e3d694863b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " pod="openstack/nova-api-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.201030 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3179576-07e2-4e05-8d10-01e3d694863b-logs\") pod \"nova-api-0\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " pod="openstack/nova-api-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.201520 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3179576-07e2-4e05-8d10-01e3d694863b-logs\") pod \"nova-api-0\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " 
pod="openstack/nova-api-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.218331 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3179576-07e2-4e05-8d10-01e3d694863b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " pod="openstack/nova-api-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.220301 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3179576-07e2-4e05-8d10-01e3d694863b-config-data\") pod \"nova-api-0\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " pod="openstack/nova-api-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.240261 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85cfl\" (UniqueName: \"kubernetes.io/projected/f3179576-07e2-4e05-8d10-01e3d694863b-kube-api-access-85cfl\") pod \"nova-api-0\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " pod="openstack/nova-api-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.303875 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820f49e8-5f60-46ba-80a8-6314d4ae2c48-config-data\") pod \"nova-metadata-0\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " pod="openstack/nova-metadata-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.303918 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gkl7\" (UniqueName: \"kubernetes.io/projected/820f49e8-5f60-46ba-80a8-6314d4ae2c48-kube-api-access-6gkl7\") pod \"nova-metadata-0\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " pod="openstack/nova-metadata-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.303943 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/820f49e8-5f60-46ba-80a8-6314d4ae2c48-logs\") pod \"nova-metadata-0\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " pod="openstack/nova-metadata-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.303984 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820f49e8-5f60-46ba-80a8-6314d4ae2c48-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " pod="openstack/nova-metadata-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.311439 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/820f49e8-5f60-46ba-80a8-6314d4ae2c48-logs\") pod \"nova-metadata-0\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " pod="openstack/nova-metadata-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.313691 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820f49e8-5f60-46ba-80a8-6314d4ae2c48-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " pod="openstack/nova-metadata-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.315349 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820f49e8-5f60-46ba-80a8-6314d4ae2c48-config-data\") pod \"nova-metadata-0\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " pod="openstack/nova-metadata-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.345814 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gkl7\" (UniqueName: \"kubernetes.io/projected/820f49e8-5f60-46ba-80a8-6314d4ae2c48-kube-api-access-6gkl7\") pod \"nova-metadata-0\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " pod="openstack/nova-metadata-0" Mar 13 12:08:38 crc kubenswrapper[4837]: 
I0313 12:08:38.347032 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-5blpv"] Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.348501 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.385702 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-5blpv"] Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.419770 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.463077 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.492091 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.509950 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.510015 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gvxg\" (UniqueName: \"kubernetes.io/projected/6de330b6-0bbb-4a9d-9062-9c7ed182a189-kube-api-access-7gvxg\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.510072 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-dns-svc\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.510099 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-config\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.510251 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.510306 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.611562 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.611918 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.611980 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.612008 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gvxg\" (UniqueName: \"kubernetes.io/projected/6de330b6-0bbb-4a9d-9062-9c7ed182a189-kube-api-access-7gvxg\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.612058 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-dns-svc\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.612090 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-config\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.613145 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-config\") pod 
\"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.613837 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.614372 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.614491 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.615125 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-dns-svc\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.648588 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gvxg\" (UniqueName: \"kubernetes.io/projected/6de330b6-0bbb-4a9d-9062-9c7ed182a189-kube-api-access-7gvxg\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " 
pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.696136 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.758353 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7f70330-cb87-42e5-96c8-6d54828f2a5a","Type":"ContainerStarted","Data":"94da49d6a7255e5847d10069ee75dd614b4c6eea7e080a518814f780623556e5"} Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.759593 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.785409 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xlps2"] Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.796769 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.407149768 podStartE2EDuration="7.796750091s" podCreationTimestamp="2026-03-13 12:08:31 +0000 UTC" firstStartedPulling="2026-03-13 12:08:32.909031394 +0000 UTC m=+1228.547298177" lastFinishedPulling="2026-03-13 12:08:38.298631737 +0000 UTC m=+1233.936898500" observedRunningTime="2026-03-13 12:08:38.785744094 +0000 UTC m=+1234.424010857" watchObservedRunningTime="2026-03-13 12:08:38.796750091 +0000 UTC m=+1234.435016844" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.046928 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.137949 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.229301 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8mzt4"] Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.231680 
4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.242866 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.242922 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.274134 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8mzt4"] Mar 13 12:08:39 crc kubenswrapper[4837]: W0313 12:08:39.294984 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod707289ff_1434_49b7_904a_58decfdd53ca.slice/crio-b6785bc0ca408832d28cd32714ea145d9a6e0bbc829424d2cf876cff8cb2427b WatchSource:0}: Error finding container b6785bc0ca408832d28cd32714ea145d9a6e0bbc829424d2cf876cff8cb2427b: Status 404 returned error can't find the container with id b6785bc0ca408832d28cd32714ea145d9a6e0bbc829424d2cf876cff8cb2427b Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.301625 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:08:39 crc kubenswrapper[4837]: W0313 12:08:39.308950 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod820f49e8_5f60_46ba_80a8_6314d4ae2c48.slice/crio-96afc52a22b33bcafbbfbaef6c12e2832f3e29442fefe3b0c048dab79efe407f WatchSource:0}: Error finding container 96afc52a22b33bcafbbfbaef6c12e2832f3e29442fefe3b0c048dab79efe407f: Status 404 returned error can't find the container with id 96afc52a22b33bcafbbfbaef6c12e2832f3e29442fefe3b0c048dab79efe407f Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.338266 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-metadata-0"] Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.365681 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8mzt4\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.365741 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kwkc\" (UniqueName: \"kubernetes.io/projected/02b82791-6ef3-4a93-9d5a-84065d62775d-kube-api-access-8kwkc\") pod \"nova-cell1-conductor-db-sync-8mzt4\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.365947 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-config-data\") pod \"nova-cell1-conductor-db-sync-8mzt4\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.366029 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-scripts\") pod \"nova-cell1-conductor-db-sync-8mzt4\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.409857 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-5blpv"] Mar 13 12:08:39 crc kubenswrapper[4837]: W0313 12:08:39.411963 4837 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6de330b6_0bbb_4a9d_9062_9c7ed182a189.slice/crio-d820b1edec0c5d2d420936bab95dffbf9bd4c7adef7db33d312ced4b311526ff WatchSource:0}: Error finding container d820b1edec0c5d2d420936bab95dffbf9bd4c7adef7db33d312ced4b311526ff: Status 404 returned error can't find the container with id d820b1edec0c5d2d420936bab95dffbf9bd4c7adef7db33d312ced4b311526ff Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.467722 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-config-data\") pod \"nova-cell1-conductor-db-sync-8mzt4\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.468403 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-scripts\") pod \"nova-cell1-conductor-db-sync-8mzt4\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.468541 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8mzt4\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.468712 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kwkc\" (UniqueName: \"kubernetes.io/projected/02b82791-6ef3-4a93-9d5a-84065d62775d-kube-api-access-8kwkc\") pod \"nova-cell1-conductor-db-sync-8mzt4\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " 
pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.471377 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-config-data\") pod \"nova-cell1-conductor-db-sync-8mzt4\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.473323 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8mzt4\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.473935 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-scripts\") pod \"nova-cell1-conductor-db-sync-8mzt4\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.488350 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kwkc\" (UniqueName: \"kubernetes.io/projected/02b82791-6ef3-4a93-9d5a-84065d62775d-kube-api-access-8kwkc\") pod \"nova-cell1-conductor-db-sync-8mzt4\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.562759 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.786595 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xlps2" event={"ID":"53268342-9adb-48b3-ba5b-52634c2c68fe","Type":"ContainerStarted","Data":"5e2fe1dde876f5e43e3e8ce2528c539e8504cc8726824d4a38da88b3f10df140"} Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.786923 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xlps2" event={"ID":"53268342-9adb-48b3-ba5b-52634c2c68fe","Type":"ContainerStarted","Data":"4965c8cb939b3555313bf5ff81aac80d2a106589cf591b13befb28e52d15f3d4"} Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.788369 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"81ec286a-b6df-4462-8023-c01230a50793","Type":"ContainerStarted","Data":"78f61644c1756b2a1acf80d548b16d064b0de263e518cd87a1b42cea8c63088a"} Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.789454 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"820f49e8-5f60-46ba-80a8-6314d4ae2c48","Type":"ContainerStarted","Data":"96afc52a22b33bcafbbfbaef6c12e2832f3e29442fefe3b0c048dab79efe407f"} Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.796173 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3179576-07e2-4e05-8d10-01e3d694863b","Type":"ContainerStarted","Data":"462ea19aaa8a4b2f42cf4a80e03784c4432ff7806e973f2c0cf7363762b9df8e"} Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.801537 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-xlps2" podStartSLOduration=2.801520946 podStartE2EDuration="2.801520946s" podCreationTimestamp="2026-03-13 12:08:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:39.800748892 +0000 UTC m=+1235.439015675" watchObservedRunningTime="2026-03-13 12:08:39.801520946 +0000 UTC m=+1235.439787709" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.803935 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"707289ff-1434-49b7-904a-58decfdd53ca","Type":"ContainerStarted","Data":"b6785bc0ca408832d28cd32714ea145d9a6e0bbc829424d2cf876cff8cb2427b"} Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.810014 4837 generic.go:334] "Generic (PLEG): container finished" podID="6de330b6-0bbb-4a9d-9062-9c7ed182a189" containerID="7468313c118293c73f68950b41a915eb07c6510dd5985dea1ec55106483d1ae9" exitCode=0 Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.810415 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-5blpv" event={"ID":"6de330b6-0bbb-4a9d-9062-9c7ed182a189","Type":"ContainerDied","Data":"7468313c118293c73f68950b41a915eb07c6510dd5985dea1ec55106483d1ae9"} Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.810507 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-5blpv" event={"ID":"6de330b6-0bbb-4a9d-9062-9c7ed182a189","Type":"ContainerStarted","Data":"d820b1edec0c5d2d420936bab95dffbf9bd4c7adef7db33d312ced4b311526ff"} Mar 13 12:08:40 crc kubenswrapper[4837]: I0313 12:08:40.107957 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8mzt4"] Mar 13 12:08:40 crc kubenswrapper[4837]: I0313 12:08:40.835340 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-5blpv" event={"ID":"6de330b6-0bbb-4a9d-9062-9c7ed182a189","Type":"ContainerStarted","Data":"1d8786b6d9674dc9d5eaebc032e5dbd8c1d018dc4a94605d311592e57b3895fc"} Mar 13 12:08:40 crc kubenswrapper[4837]: I0313 12:08:40.835850 4837 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:40 crc kubenswrapper[4837]: I0313 12:08:40.846349 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8mzt4" event={"ID":"02b82791-6ef3-4a93-9d5a-84065d62775d","Type":"ContainerStarted","Data":"deea73f54571ed1f4517906256e112c93e642ebacb77d1a62a53b5217eb1d25c"} Mar 13 12:08:40 crc kubenswrapper[4837]: I0313 12:08:40.846418 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8mzt4" event={"ID":"02b82791-6ef3-4a93-9d5a-84065d62775d","Type":"ContainerStarted","Data":"8e78351e0267ed6011c0c508a5d86a34c1efe986a8b326b91fdf940d351283e1"} Mar 13 12:08:40 crc kubenswrapper[4837]: I0313 12:08:40.862464 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-5blpv" podStartSLOduration=2.86244602 podStartE2EDuration="2.86244602s" podCreationTimestamp="2026-03-13 12:08:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:40.855537942 +0000 UTC m=+1236.493804705" watchObservedRunningTime="2026-03-13 12:08:40.86244602 +0000 UTC m=+1236.500712783" Mar 13 12:08:40 crc kubenswrapper[4837]: I0313 12:08:40.886859 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-8mzt4" podStartSLOduration=1.8868358779999999 podStartE2EDuration="1.886835878s" podCreationTimestamp="2026-03-13 12:08:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:40.873997364 +0000 UTC m=+1236.512264127" watchObservedRunningTime="2026-03-13 12:08:40.886835878 +0000 UTC m=+1236.525102641" Mar 13 12:08:41 crc kubenswrapper[4837]: I0313 12:08:41.822416 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 12:08:41 crc kubenswrapper[4837]: I0313 12:08:41.832333 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:08:42 crc kubenswrapper[4837]: I0313 12:08:42.877699 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"81ec286a-b6df-4462-8023-c01230a50793","Type":"ContainerStarted","Data":"2b72c4b74ac632994ae39578139216d840009de89378dfe0823503769ad992b6"} Mar 13 12:08:42 crc kubenswrapper[4837]: I0313 12:08:42.877905 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="81ec286a-b6df-4462-8023-c01230a50793" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://2b72c4b74ac632994ae39578139216d840009de89378dfe0823503769ad992b6" gracePeriod=30 Mar 13 12:08:42 crc kubenswrapper[4837]: I0313 12:08:42.881165 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"820f49e8-5f60-46ba-80a8-6314d4ae2c48","Type":"ContainerStarted","Data":"e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6"} Mar 13 12:08:42 crc kubenswrapper[4837]: I0313 12:08:42.883219 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3179576-07e2-4e05-8d10-01e3d694863b","Type":"ContainerStarted","Data":"0511fe858584c41b4362fa4eb0bbad5c40393493b881d67bbae3af394094d397"} Mar 13 12:08:42 crc kubenswrapper[4837]: I0313 12:08:42.897493 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"707289ff-1434-49b7-904a-58decfdd53ca","Type":"ContainerStarted","Data":"de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089"} Mar 13 12:08:42 crc kubenswrapper[4837]: I0313 12:08:42.903276 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.5364554310000003 
podStartE2EDuration="5.903258027s" podCreationTimestamp="2026-03-13 12:08:37 +0000 UTC" firstStartedPulling="2026-03-13 12:08:39.14038959 +0000 UTC m=+1234.778656353" lastFinishedPulling="2026-03-13 12:08:42.507192186 +0000 UTC m=+1238.145458949" observedRunningTime="2026-03-13 12:08:42.89700995 +0000 UTC m=+1238.535276723" watchObservedRunningTime="2026-03-13 12:08:42.903258027 +0000 UTC m=+1238.541524790" Mar 13 12:08:42 crc kubenswrapper[4837]: I0313 12:08:42.921118 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.707180286 podStartE2EDuration="5.921097688s" podCreationTimestamp="2026-03-13 12:08:37 +0000 UTC" firstStartedPulling="2026-03-13 12:08:39.297300101 +0000 UTC m=+1234.935566864" lastFinishedPulling="2026-03-13 12:08:42.511217503 +0000 UTC m=+1238.149484266" observedRunningTime="2026-03-13 12:08:42.913195129 +0000 UTC m=+1238.551461892" watchObservedRunningTime="2026-03-13 12:08:42.921097688 +0000 UTC m=+1238.559364451" Mar 13 12:08:43 crc kubenswrapper[4837]: I0313 12:08:43.064094 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:08:43 crc kubenswrapper[4837]: I0313 12:08:43.492610 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 12:08:43 crc kubenswrapper[4837]: I0313 12:08:43.908552 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"820f49e8-5f60-46ba-80a8-6314d4ae2c48","Type":"ContainerStarted","Data":"840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7"} Mar 13 12:08:43 crc kubenswrapper[4837]: I0313 12:08:43.908770 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="820f49e8-5f60-46ba-80a8-6314d4ae2c48" containerName="nova-metadata-log" 
containerID="cri-o://e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6" gracePeriod=30 Mar 13 12:08:43 crc kubenswrapper[4837]: I0313 12:08:43.909407 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="820f49e8-5f60-46ba-80a8-6314d4ae2c48" containerName="nova-metadata-metadata" containerID="cri-o://840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7" gracePeriod=30 Mar 13 12:08:43 crc kubenswrapper[4837]: I0313 12:08:43.915852 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3179576-07e2-4e05-8d10-01e3d694863b","Type":"ContainerStarted","Data":"ccea0ec2cd3b8c08290e7221354973c6421a6b999d1adffb63e82adac076716a"} Mar 13 12:08:43 crc kubenswrapper[4837]: I0313 12:08:43.932525 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.702636859 podStartE2EDuration="5.932509243s" podCreationTimestamp="2026-03-13 12:08:38 +0000 UTC" firstStartedPulling="2026-03-13 12:08:39.311275861 +0000 UTC m=+1234.949542624" lastFinishedPulling="2026-03-13 12:08:42.541148245 +0000 UTC m=+1238.179415008" observedRunningTime="2026-03-13 12:08:43.925607216 +0000 UTC m=+1239.563873979" watchObservedRunningTime="2026-03-13 12:08:43.932509243 +0000 UTC m=+1239.570775996" Mar 13 12:08:43 crc kubenswrapper[4837]: I0313 12:08:43.955357 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.511674426 podStartE2EDuration="6.955331382s" podCreationTimestamp="2026-03-13 12:08:37 +0000 UTC" firstStartedPulling="2026-03-13 12:08:39.068331071 +0000 UTC m=+1234.706597834" lastFinishedPulling="2026-03-13 12:08:42.511988017 +0000 UTC m=+1238.150254790" observedRunningTime="2026-03-13 12:08:43.949616512 +0000 UTC m=+1239.587883285" watchObservedRunningTime="2026-03-13 12:08:43.955331382 +0000 UTC m=+1239.593598145" Mar 13 12:08:44 crc 
kubenswrapper[4837]: I0313 12:08:44.482735 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.583421 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820f49e8-5f60-46ba-80a8-6314d4ae2c48-combined-ca-bundle\") pod \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.583501 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820f49e8-5f60-46ba-80a8-6314d4ae2c48-config-data\") pod \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.583561 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gkl7\" (UniqueName: \"kubernetes.io/projected/820f49e8-5f60-46ba-80a8-6314d4ae2c48-kube-api-access-6gkl7\") pod \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.583769 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/820f49e8-5f60-46ba-80a8-6314d4ae2c48-logs\") pod \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.584155 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/820f49e8-5f60-46ba-80a8-6314d4ae2c48-logs" (OuterVolumeSpecName: "logs") pod "820f49e8-5f60-46ba-80a8-6314d4ae2c48" (UID: "820f49e8-5f60-46ba-80a8-6314d4ae2c48"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.584276 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/820f49e8-5f60-46ba-80a8-6314d4ae2c48-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.593913 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/820f49e8-5f60-46ba-80a8-6314d4ae2c48-kube-api-access-6gkl7" (OuterVolumeSpecName: "kube-api-access-6gkl7") pod "820f49e8-5f60-46ba-80a8-6314d4ae2c48" (UID: "820f49e8-5f60-46ba-80a8-6314d4ae2c48"). InnerVolumeSpecName "kube-api-access-6gkl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.609884 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/820f49e8-5f60-46ba-80a8-6314d4ae2c48-config-data" (OuterVolumeSpecName: "config-data") pod "820f49e8-5f60-46ba-80a8-6314d4ae2c48" (UID: "820f49e8-5f60-46ba-80a8-6314d4ae2c48"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.612398 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/820f49e8-5f60-46ba-80a8-6314d4ae2c48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "820f49e8-5f60-46ba-80a8-6314d4ae2c48" (UID: "820f49e8-5f60-46ba-80a8-6314d4ae2c48"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.696848 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820f49e8-5f60-46ba-80a8-6314d4ae2c48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.696907 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820f49e8-5f60-46ba-80a8-6314d4ae2c48-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.696921 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gkl7\" (UniqueName: \"kubernetes.io/projected/820f49e8-5f60-46ba-80a8-6314d4ae2c48-kube-api-access-6gkl7\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.928150 4837 generic.go:334] "Generic (PLEG): container finished" podID="820f49e8-5f60-46ba-80a8-6314d4ae2c48" containerID="840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7" exitCode=0 Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.928192 4837 generic.go:334] "Generic (PLEG): container finished" podID="820f49e8-5f60-46ba-80a8-6314d4ae2c48" containerID="e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6" exitCode=143 Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.928241 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"820f49e8-5f60-46ba-80a8-6314d4ae2c48","Type":"ContainerDied","Data":"840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7"} Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.928258 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.928283 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"820f49e8-5f60-46ba-80a8-6314d4ae2c48","Type":"ContainerDied","Data":"e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6"} Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.928295 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"820f49e8-5f60-46ba-80a8-6314d4ae2c48","Type":"ContainerDied","Data":"96afc52a22b33bcafbbfbaef6c12e2832f3e29442fefe3b0c048dab79efe407f"} Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.928313 4837 scope.go:117] "RemoveContainer" containerID="840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.975545 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.986490 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.996351 4837 scope.go:117] "RemoveContainer" containerID="e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.996407 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:08:44 crc kubenswrapper[4837]: E0313 12:08:44.996823 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="820f49e8-5f60-46ba-80a8-6314d4ae2c48" containerName="nova-metadata-metadata" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.996835 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="820f49e8-5f60-46ba-80a8-6314d4ae2c48" containerName="nova-metadata-metadata" Mar 13 12:08:44 crc kubenswrapper[4837]: E0313 12:08:44.996864 4837 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="820f49e8-5f60-46ba-80a8-6314d4ae2c48" containerName="nova-metadata-log" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.996871 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="820f49e8-5f60-46ba-80a8-6314d4ae2c48" containerName="nova-metadata-log" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.997056 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="820f49e8-5f60-46ba-80a8-6314d4ae2c48" containerName="nova-metadata-metadata" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.997080 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="820f49e8-5f60-46ba-80a8-6314d4ae2c48" containerName="nova-metadata-log" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.998143 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.000426 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.000600 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.005210 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.048810 4837 scope.go:117] "RemoveContainer" containerID="840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7" Mar 13 12:08:45 crc kubenswrapper[4837]: E0313 12:08:45.052884 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7\": container with ID starting with 840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7 not found: ID does not exist" 
containerID="840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.052940 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7"} err="failed to get container status \"840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7\": rpc error: code = NotFound desc = could not find container \"840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7\": container with ID starting with 840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7 not found: ID does not exist" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.052971 4837 scope.go:117] "RemoveContainer" containerID="e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6" Mar 13 12:08:45 crc kubenswrapper[4837]: E0313 12:08:45.053359 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6\": container with ID starting with e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6 not found: ID does not exist" containerID="e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.053396 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6"} err="failed to get container status \"e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6\": rpc error: code = NotFound desc = could not find container \"e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6\": container with ID starting with e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6 not found: ID does not exist" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.053423 4837 scope.go:117] 
"RemoveContainer" containerID="840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.054124 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7"} err="failed to get container status \"840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7\": rpc error: code = NotFound desc = could not find container \"840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7\": container with ID starting with 840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7 not found: ID does not exist" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.054148 4837 scope.go:117] "RemoveContainer" containerID="e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.054964 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6"} err="failed to get container status \"e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6\": rpc error: code = NotFound desc = could not find container \"e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6\": container with ID starting with e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6 not found: ID does not exist" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.062680 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="820f49e8-5f60-46ba-80a8-6314d4ae2c48" path="/var/lib/kubelet/pods/820f49e8-5f60-46ba-80a8-6314d4ae2c48/volumes" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.107843 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.107956 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-logs\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.108022 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hzrv\" (UniqueName: \"kubernetes.io/projected/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-kube-api-access-5hzrv\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.108059 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-config-data\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.108075 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: E0313 12:08:45.208196 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod820f49e8_5f60_46ba_80a8_6314d4ae2c48.slice\": 
RecentStats: unable to find data in memory cache]" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.210122 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-config-data\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.210260 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.210430 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.210653 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-logs\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.210817 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hzrv\" (UniqueName: \"kubernetes.io/projected/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-kube-api-access-5hzrv\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.211925 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-logs\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.215211 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-config-data\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.215436 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.215721 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.232536 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hzrv\" (UniqueName: \"kubernetes.io/projected/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-kube-api-access-5hzrv\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.324355 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.766541 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:08:45 crc kubenswrapper[4837]: W0313 12:08:45.777100 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d0251c8_3594_482e_bd3c_2ca33c9e0ab5.slice/crio-e3827fca94d00b12ca92bacbca73cd8bbf6d6f767ab5ac87612ce11ca72155b4 WatchSource:0}: Error finding container e3827fca94d00b12ca92bacbca73cd8bbf6d6f767ab5ac87612ce11ca72155b4: Status 404 returned error can't find the container with id e3827fca94d00b12ca92bacbca73cd8bbf6d6f767ab5ac87612ce11ca72155b4 Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.941907 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5","Type":"ContainerStarted","Data":"e3827fca94d00b12ca92bacbca73cd8bbf6d6f767ab5ac87612ce11ca72155b4"} Mar 13 12:08:46 crc kubenswrapper[4837]: I0313 12:08:46.956483 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5","Type":"ContainerStarted","Data":"3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a"} Mar 13 12:08:46 crc kubenswrapper[4837]: I0313 12:08:46.957042 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5","Type":"ContainerStarted","Data":"2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495"} Mar 13 12:08:46 crc kubenswrapper[4837]: I0313 12:08:46.980782 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.980764639 podStartE2EDuration="2.980764639s" podCreationTimestamp="2026-03-13 12:08:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:46.972816619 +0000 UTC m=+1242.611083402" watchObservedRunningTime="2026-03-13 12:08:46.980764639 +0000 UTC m=+1242.619031402" Mar 13 12:08:47 crc kubenswrapper[4837]: I0313 12:08:47.967335 4837 generic.go:334] "Generic (PLEG): container finished" podID="53268342-9adb-48b3-ba5b-52634c2c68fe" containerID="5e2fe1dde876f5e43e3e8ce2528c539e8504cc8726824d4a38da88b3f10df140" exitCode=0 Mar 13 12:08:47 crc kubenswrapper[4837]: I0313 12:08:47.967423 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xlps2" event={"ID":"53268342-9adb-48b3-ba5b-52634c2c68fe","Type":"ContainerDied","Data":"5e2fe1dde876f5e43e3e8ce2528c539e8504cc8726824d4a38da88b3f10df140"} Mar 13 12:08:47 crc kubenswrapper[4837]: I0313 12:08:47.969713 4837 generic.go:334] "Generic (PLEG): container finished" podID="02b82791-6ef3-4a93-9d5a-84065d62775d" containerID="deea73f54571ed1f4517906256e112c93e642ebacb77d1a62a53b5217eb1d25c" exitCode=0 Mar 13 12:08:47 crc kubenswrapper[4837]: I0313 12:08:47.969754 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8mzt4" event={"ID":"02b82791-6ef3-4a93-9d5a-84065d62775d","Type":"ContainerDied","Data":"deea73f54571ed1f4517906256e112c93e642ebacb77d1a62a53b5217eb1d25c"} Mar 13 12:08:48 crc kubenswrapper[4837]: I0313 12:08:48.421142 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 12:08:48 crc kubenswrapper[4837]: I0313 12:08:48.421582 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 12:08:48 crc kubenswrapper[4837]: I0313 12:08:48.493171 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 12:08:48 crc kubenswrapper[4837]: I0313 12:08:48.521397 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-scheduler-0"
Mar 13 12:08:48 crc kubenswrapper[4837]: I0313 12:08:48.697880 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-5blpv"
Mar 13 12:08:48 crc kubenswrapper[4837]: I0313 12:08:48.757219 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-txzkw"]
Mar 13 12:08:48 crc kubenswrapper[4837]: I0313 12:08:48.757453 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" podUID="fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" containerName="dnsmasq-dns" containerID="cri-o://308f5a2ca30c72015ad1831a239549e973a6a698921b4916b0e838cdf0b49c8a" gracePeriod=10
Mar 13 12:08:48 crc kubenswrapper[4837]: I0313 12:08:48.983559 4837 generic.go:334] "Generic (PLEG): container finished" podID="fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" containerID="308f5a2ca30c72015ad1831a239549e973a6a698921b4916b0e838cdf0b49c8a" exitCode=0
Mar 13 12:08:48 crc kubenswrapper[4837]: I0313 12:08:48.984390 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" event={"ID":"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b","Type":"ContainerDied","Data":"308f5a2ca30c72015ad1831a239549e973a6a698921b4916b0e838cdf0b49c8a"}
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.041241 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.353885 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw"
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.503315 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-ovsdbserver-sb\") pod \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") "
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.503402 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2wh2\" (UniqueName: \"kubernetes.io/projected/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-kube-api-access-s2wh2\") pod \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") "
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.503485 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-config\") pod \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") "
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.503585 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-dns-svc\") pod \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") "
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.503613 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-ovsdbserver-nb\") pod \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") "
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.503771 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-dns-swift-storage-0\") pod \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") "
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.505423 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f3179576-07e2-4e05-8d10-01e3d694863b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.196:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.508445 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f3179576-07e2-4e05-8d10-01e3d694863b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.196:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.515671 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-kube-api-access-s2wh2" (OuterVolumeSpecName: "kube-api-access-s2wh2") pod "fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" (UID: "fd9a8546-e61b-47e0-90b9-e6c8e4365b0b"). InnerVolumeSpecName "kube-api-access-s2wh2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.555328 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8mzt4"
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.574209 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" (UID: "fd9a8546-e61b-47e0-90b9-e6c8e4365b0b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.583376 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xlps2"
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.587965 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" (UID: "fd9a8546-e61b-47e0-90b9-e6c8e4365b0b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.596490 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-config" (OuterVolumeSpecName: "config") pod "fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" (UID: "fd9a8546-e61b-47e0-90b9-e6c8e4365b0b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.609682 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fltcz\" (UniqueName: \"kubernetes.io/projected/53268342-9adb-48b3-ba5b-52634c2c68fe-kube-api-access-fltcz\") pod \"53268342-9adb-48b3-ba5b-52634c2c68fe\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") "
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.609778 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-config-data\") pod \"53268342-9adb-48b3-ba5b-52634c2c68fe\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") "
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.609816 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kwkc\" (UniqueName: \"kubernetes.io/projected/02b82791-6ef3-4a93-9d5a-84065d62775d-kube-api-access-8kwkc\") pod \"02b82791-6ef3-4a93-9d5a-84065d62775d\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") "
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.609902 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-combined-ca-bundle\") pod \"53268342-9adb-48b3-ba5b-52634c2c68fe\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") "
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.609957 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-scripts\") pod \"53268342-9adb-48b3-ba5b-52634c2c68fe\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") "
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.609988 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" (UID: "fd9a8546-e61b-47e0-90b9-e6c8e4365b0b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.610058 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-config-data\") pod \"02b82791-6ef3-4a93-9d5a-84065d62775d\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") "
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.610093 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-scripts\") pod \"02b82791-6ef3-4a93-9d5a-84065d62775d\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") "
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.610169 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-combined-ca-bundle\") pod \"02b82791-6ef3-4a93-9d5a-84065d62775d\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") "
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.610776 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.610798 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.610810 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.610824 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2wh2\" (UniqueName: \"kubernetes.io/projected/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-kube-api-access-s2wh2\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.610840 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-config\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.614327 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" (UID: "fd9a8546-e61b-47e0-90b9-e6c8e4365b0b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.614516 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53268342-9adb-48b3-ba5b-52634c2c68fe-kube-api-access-fltcz" (OuterVolumeSpecName: "kube-api-access-fltcz") pod "53268342-9adb-48b3-ba5b-52634c2c68fe" (UID: "53268342-9adb-48b3-ba5b-52634c2c68fe"). InnerVolumeSpecName "kube-api-access-fltcz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.614684 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02b82791-6ef3-4a93-9d5a-84065d62775d-kube-api-access-8kwkc" (OuterVolumeSpecName: "kube-api-access-8kwkc") pod "02b82791-6ef3-4a93-9d5a-84065d62775d" (UID: "02b82791-6ef3-4a93-9d5a-84065d62775d"). InnerVolumeSpecName "kube-api-access-8kwkc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.616937 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-scripts" (OuterVolumeSpecName: "scripts") pod "02b82791-6ef3-4a93-9d5a-84065d62775d" (UID: "02b82791-6ef3-4a93-9d5a-84065d62775d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.617008 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-scripts" (OuterVolumeSpecName: "scripts") pod "53268342-9adb-48b3-ba5b-52634c2c68fe" (UID: "53268342-9adb-48b3-ba5b-52634c2c68fe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.637927 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53268342-9adb-48b3-ba5b-52634c2c68fe" (UID: "53268342-9adb-48b3-ba5b-52634c2c68fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.658357 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-config-data" (OuterVolumeSpecName: "config-data") pod "53268342-9adb-48b3-ba5b-52634c2c68fe" (UID: "53268342-9adb-48b3-ba5b-52634c2c68fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.660975 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02b82791-6ef3-4a93-9d5a-84065d62775d" (UID: "02b82791-6ef3-4a93-9d5a-84065d62775d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.663889 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-config-data" (OuterVolumeSpecName: "config-data") pod "02b82791-6ef3-4a93-9d5a-84065d62775d" (UID: "02b82791-6ef3-4a93-9d5a-84065d62775d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.713074 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.713137 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.713155 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.713180 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.713201 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fltcz\" (UniqueName: \"kubernetes.io/projected/53268342-9adb-48b3-ba5b-52634c2c68fe-kube-api-access-fltcz\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.713217 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.713235 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kwkc\" (UniqueName: \"kubernetes.io/projected/02b82791-6ef3-4a93-9d5a-84065d62775d-kube-api-access-8kwkc\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.713251 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.713265 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.994057 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xlps2" event={"ID":"53268342-9adb-48b3-ba5b-52634c2c68fe","Type":"ContainerDied","Data":"4965c8cb939b3555313bf5ff81aac80d2a106589cf591b13befb28e52d15f3d4"}
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.994102 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4965c8cb939b3555313bf5ff81aac80d2a106589cf591b13befb28e52d15f3d4"
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.994209 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xlps2"
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.998879 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" event={"ID":"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b","Type":"ContainerDied","Data":"3b4bbdde4e1a36119cc27a40f2a694902d8b5f53fa6c902b59c1385e734f5a5e"}
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.998950 4837 scope.go:117] "RemoveContainer" containerID="308f5a2ca30c72015ad1831a239549e973a6a698921b4916b0e838cdf0b49c8a"
Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.998976 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw"
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.000484 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8mzt4" event={"ID":"02b82791-6ef3-4a93-9d5a-84065d62775d","Type":"ContainerDied","Data":"8e78351e0267ed6011c0c508a5d86a34c1efe986a8b326b91fdf940d351283e1"}
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.000499 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8mzt4"
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.000574 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e78351e0267ed6011c0c508a5d86a34c1efe986a8b326b91fdf940d351283e1"
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.030561 4837 scope.go:117] "RemoveContainer" containerID="18aeb282fbd8558fc7f2a4d93c502285e6ae25649a3f62cf2708ff5492d7993d"
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.133360 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-txzkw"]
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.154930 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-txzkw"]
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.202629 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 13 12:08:50 crc kubenswrapper[4837]: E0313 12:08:50.203384 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02b82791-6ef3-4a93-9d5a-84065d62775d" containerName="nova-cell1-conductor-db-sync"
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.207599 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="02b82791-6ef3-4a93-9d5a-84065d62775d" containerName="nova-cell1-conductor-db-sync"
Mar 13 12:08:50 crc kubenswrapper[4837]: E0313 12:08:50.207821 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" containerName="init"
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.207899 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" containerName="init"
Mar 13 12:08:50 crc kubenswrapper[4837]: E0313 12:08:50.208009 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53268342-9adb-48b3-ba5b-52634c2c68fe" containerName="nova-manage"
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.208091 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="53268342-9adb-48b3-ba5b-52634c2c68fe" containerName="nova-manage"
Mar 13 12:08:50 crc kubenswrapper[4837]: E0313 12:08:50.208198 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" containerName="dnsmasq-dns"
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.208274 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" containerName="dnsmasq-dns"
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.208862 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="53268342-9adb-48b3-ba5b-52634c2c68fe" containerName="nova-manage"
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.208968 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="02b82791-6ef3-4a93-9d5a-84065d62775d" containerName="nova-cell1-conductor-db-sync"
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.209059 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" containerName="dnsmasq-dns"
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.212889 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.224536 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.230968 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.235613 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a51debb-c1cb-4a55-b845-e89d89d11e86-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9a51debb-c1cb-4a55-b845-e89d89d11e86\") " pod="openstack/nova-cell1-conductor-0"
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.236559 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a51debb-c1cb-4a55-b845-e89d89d11e86-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9a51debb-c1cb-4a55-b845-e89d89d11e86\") " pod="openstack/nova-cell1-conductor-0"
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.236732 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx7rd\" (UniqueName: \"kubernetes.io/projected/9a51debb-c1cb-4a55-b845-e89d89d11e86-kube-api-access-hx7rd\") pod \"nova-cell1-conductor-0\" (UID: \"9a51debb-c1cb-4a55-b845-e89d89d11e86\") " pod="openstack/nova-cell1-conductor-0"
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.320741 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.324713 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.324840 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.338443 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a51debb-c1cb-4a55-b845-e89d89d11e86-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9a51debb-c1cb-4a55-b845-e89d89d11e86\") " pod="openstack/nova-cell1-conductor-0"
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.338497 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a51debb-c1cb-4a55-b845-e89d89d11e86-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9a51debb-c1cb-4a55-b845-e89d89d11e86\") " pod="openstack/nova-cell1-conductor-0"
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.338537 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx7rd\" (UniqueName: \"kubernetes.io/projected/9a51debb-c1cb-4a55-b845-e89d89d11e86-kube-api-access-hx7rd\") pod \"nova-cell1-conductor-0\" (UID: \"9a51debb-c1cb-4a55-b845-e89d89d11e86\") " pod="openstack/nova-cell1-conductor-0"
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.342792 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a51debb-c1cb-4a55-b845-e89d89d11e86-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9a51debb-c1cb-4a55-b845-e89d89d11e86\") " pod="openstack/nova-cell1-conductor-0"
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.346276 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.346595 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f3179576-07e2-4e05-8d10-01e3d694863b" containerName="nova-api-log" containerID="cri-o://0511fe858584c41b4362fa4eb0bbad5c40393493b881d67bbae3af394094d397" gracePeriod=30
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.346728 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f3179576-07e2-4e05-8d10-01e3d694863b" containerName="nova-api-api" containerID="cri-o://ccea0ec2cd3b8c08290e7221354973c6421a6b999d1adffb63e82adac076716a" gracePeriod=30
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.357700 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.366311 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx7rd\" (UniqueName: \"kubernetes.io/projected/9a51debb-c1cb-4a55-b845-e89d89d11e86-kube-api-access-hx7rd\") pod \"nova-cell1-conductor-0\" (UID: \"9a51debb-c1cb-4a55-b845-e89d89d11e86\") " pod="openstack/nova-cell1-conductor-0"
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.367210 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a51debb-c1cb-4a55-b845-e89d89d11e86-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9a51debb-c1cb-4a55-b845-e89d89d11e86\") " pod="openstack/nova-cell1-conductor-0"
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.550706 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.993976 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 13 12:08:50 crc kubenswrapper[4837]: W0313 12:08:50.997360 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a51debb_c1cb_4a55_b845_e89d89d11e86.slice/crio-8cfc11ef296434fbdcd112fcff3fcf482dfc699684f6173db012822b534b56d7 WatchSource:0}: Error finding container 8cfc11ef296434fbdcd112fcff3fcf482dfc699684f6173db012822b534b56d7: Status 404 returned error can't find the container with id 8cfc11ef296434fbdcd112fcff3fcf482dfc699684f6173db012822b534b56d7
Mar 13 12:08:51 crc kubenswrapper[4837]: I0313 12:08:51.014205 4837 generic.go:334] "Generic (PLEG): container finished" podID="f3179576-07e2-4e05-8d10-01e3d694863b" containerID="0511fe858584c41b4362fa4eb0bbad5c40393493b881d67bbae3af394094d397" exitCode=143
Mar 13 12:08:51 crc kubenswrapper[4837]: I0313 12:08:51.014276 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3179576-07e2-4e05-8d10-01e3d694863b","Type":"ContainerDied","Data":"0511fe858584c41b4362fa4eb0bbad5c40393493b881d67bbae3af394094d397"}
Mar 13 12:08:51 crc kubenswrapper[4837]: I0313 12:08:51.016902 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9a51debb-c1cb-4a55-b845-e89d89d11e86","Type":"ContainerStarted","Data":"8cfc11ef296434fbdcd112fcff3fcf482dfc699684f6173db012822b534b56d7"}
Mar 13 12:08:51 crc kubenswrapper[4837]: I0313 12:08:51.026290 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="707289ff-1434-49b7-904a-58decfdd53ca" containerName="nova-scheduler-scheduler" containerID="cri-o://de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089" gracePeriod=30
Mar 13 12:08:51 crc kubenswrapper[4837]: I0313 12:08:51.071262 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" path="/var/lib/kubelet/pods/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b/volumes"
Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.045512 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" containerName="nova-metadata-log" containerID="cri-o://2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495" gracePeriod=30
Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.046808 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9a51debb-c1cb-4a55-b845-e89d89d11e86","Type":"ContainerStarted","Data":"c1cca02df9f56d80002d5370498f9f6c551789fcd9e78dfef99dca9ce9a40416"}
Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.047078 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.047131 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" containerName="nova-metadata-metadata" containerID="cri-o://3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a" gracePeriod=30
Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.072455 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.072439004 podStartE2EDuration="2.072439004s" podCreationTimestamp="2026-03-13 12:08:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:52.065766794 +0000 UTC m=+1247.704033547" watchObservedRunningTime="2026-03-13 12:08:52.072439004 +0000 UTC m=+1247.710705767"
Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.622259 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.704857 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-combined-ca-bundle\") pod \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") "
Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.704921 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-config-data\") pod \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") "
Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.704957 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hzrv\" (UniqueName: \"kubernetes.io/projected/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-kube-api-access-5hzrv\") pod \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") "
Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.705042 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-logs\") pod \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") "
Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.705065 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-nova-metadata-tls-certs\") pod \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") "
Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.706844 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-logs" (OuterVolumeSpecName: "logs") pod "4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" (UID: "4d0251c8-3594-482e-bd3c-2ca33c9e0ab5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.710444 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-kube-api-access-5hzrv" (OuterVolumeSpecName: "kube-api-access-5hzrv") pod "4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" (UID: "4d0251c8-3594-482e-bd3c-2ca33c9e0ab5"). InnerVolumeSpecName "kube-api-access-5hzrv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.748350 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" (UID: "4d0251c8-3594-482e-bd3c-2ca33c9e0ab5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.752415 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-config-data" (OuterVolumeSpecName: "config-data") pod "4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" (UID: "4d0251c8-3594-482e-bd3c-2ca33c9e0ab5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.757723 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" (UID: "4d0251c8-3594-482e-bd3c-2ca33c9e0ab5"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.806708 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.806744 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.806754 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hzrv\" (UniqueName: \"kubernetes.io/projected/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-kube-api-access-5hzrv\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.806764 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-logs\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.806774 4837 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.056721 4837 generic.go:334] "Generic (PLEG): container finished" podID="4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" containerID="3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a" exitCode=0
Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.056767 4837 generic.go:334] "Generic (PLEG): container finished" podID="4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" containerID="2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495" exitCode=143
Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.056951 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.067791 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5","Type":"ContainerDied","Data":"3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a"}
Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.067866 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5","Type":"ContainerDied","Data":"2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495"}
Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.067884 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5","Type":"ContainerDied","Data":"e3827fca94d00b12ca92bacbca73cd8bbf6d6f767ab5ac87612ce11ca72155b4"}
Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.068012 4837 scope.go:117] "RemoveContainer" containerID="3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a"
Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.107968 4837 scope.go:117] "RemoveContainer" containerID="2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495"
Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.108786 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 13 12:08:53 crc 
kubenswrapper[4837]: I0313 12:08:53.128785 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.136513 4837 scope.go:117] "RemoveContainer" containerID="3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a" Mar 13 12:08:53 crc kubenswrapper[4837]: E0313 12:08:53.136967 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a\": container with ID starting with 3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a not found: ID does not exist" containerID="3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.137009 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a"} err="failed to get container status \"3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a\": rpc error: code = NotFound desc = could not find container \"3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a\": container with ID starting with 3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a not found: ID does not exist" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.137035 4837 scope.go:117] "RemoveContainer" containerID="2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495" Mar 13 12:08:53 crc kubenswrapper[4837]: E0313 12:08:53.137340 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495\": container with ID starting with 2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495 not found: ID does not exist" 
containerID="2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.137382 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495"} err="failed to get container status \"2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495\": rpc error: code = NotFound desc = could not find container \"2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495\": container with ID starting with 2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495 not found: ID does not exist" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.137407 4837 scope.go:117] "RemoveContainer" containerID="3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.137624 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a"} err="failed to get container status \"3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a\": rpc error: code = NotFound desc = could not find container \"3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a\": container with ID starting with 3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a not found: ID does not exist" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.137658 4837 scope.go:117] "RemoveContainer" containerID="2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.137871 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495"} err="failed to get container status \"2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495\": rpc error: code = NotFound desc = could 
not find container \"2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495\": container with ID starting with 2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495 not found: ID does not exist" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.139282 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:08:53 crc kubenswrapper[4837]: E0313 12:08:53.139696 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" containerName="nova-metadata-metadata" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.139708 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" containerName="nova-metadata-metadata" Mar 13 12:08:53 crc kubenswrapper[4837]: E0313 12:08:53.139747 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" containerName="nova-metadata-log" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.139753 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" containerName="nova-metadata-log" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.139915 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" containerName="nova-metadata-log" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.139936 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" containerName="nova-metadata-metadata" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.140935 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.143543 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.143888 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.150845 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.319933 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-config-data\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.320028 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7skk9\" (UniqueName: \"kubernetes.io/projected/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-kube-api-access-7skk9\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.320062 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.320107 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-logs\") pod \"nova-metadata-0\" (UID: 
\"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.320192 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.422520 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7skk9\" (UniqueName: \"kubernetes.io/projected/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-kube-api-access-7skk9\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.422624 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.422685 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-logs\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.422771 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 
12:08:53.422827 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-config-data\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.424466 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-logs\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.427353 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-config-data\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.427595 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.431668 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.450095 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7skk9\" (UniqueName: \"kubernetes.io/projected/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-kube-api-access-7skk9\") pod 
\"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: E0313 12:08:53.494910 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 12:08:53 crc kubenswrapper[4837]: E0313 12:08:53.496155 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 12:08:53 crc kubenswrapper[4837]: E0313 12:08:53.497843 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 12:08:53 crc kubenswrapper[4837]: E0313 12:08:53.497936 4837 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="707289ff-1434-49b7-904a-58decfdd53ca" containerName="nova-scheduler-scheduler" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.510801 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.951302 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:08:53 crc kubenswrapper[4837]: W0313 12:08:53.954450 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5000e5ff_8cf6_4f0c_a6c4_e6b550c2fe43.slice/crio-c2ff21ee05eb4c0cd65e5feb281a54f68d478fb493c97488ec3ad06bbc0f4880 WatchSource:0}: Error finding container c2ff21ee05eb4c0cd65e5feb281a54f68d478fb493c97488ec3ad06bbc0f4880: Status 404 returned error can't find the container with id c2ff21ee05eb4c0cd65e5feb281a54f68d478fb493c97488ec3ad06bbc0f4880 Mar 13 12:08:54 crc kubenswrapper[4837]: I0313 12:08:54.067670 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43","Type":"ContainerStarted","Data":"c2ff21ee05eb4c0cd65e5feb281a54f68d478fb493c97488ec3ad06bbc0f4880"} Mar 13 12:08:54 crc kubenswrapper[4837]: I0313 12:08:54.710198 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:08:54 crc kubenswrapper[4837]: I0313 12:08:54.888589 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707289ff-1434-49b7-904a-58decfdd53ca-combined-ca-bundle\") pod \"707289ff-1434-49b7-904a-58decfdd53ca\" (UID: \"707289ff-1434-49b7-904a-58decfdd53ca\") " Mar 13 12:08:54 crc kubenswrapper[4837]: I0313 12:08:54.888898 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzqjg\" (UniqueName: \"kubernetes.io/projected/707289ff-1434-49b7-904a-58decfdd53ca-kube-api-access-jzqjg\") pod \"707289ff-1434-49b7-904a-58decfdd53ca\" (UID: \"707289ff-1434-49b7-904a-58decfdd53ca\") " Mar 13 12:08:54 crc kubenswrapper[4837]: I0313 12:08:54.888936 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707289ff-1434-49b7-904a-58decfdd53ca-config-data\") pod \"707289ff-1434-49b7-904a-58decfdd53ca\" (UID: \"707289ff-1434-49b7-904a-58decfdd53ca\") " Mar 13 12:08:54 crc kubenswrapper[4837]: I0313 12:08:54.894229 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/707289ff-1434-49b7-904a-58decfdd53ca-kube-api-access-jzqjg" (OuterVolumeSpecName: "kube-api-access-jzqjg") pod "707289ff-1434-49b7-904a-58decfdd53ca" (UID: "707289ff-1434-49b7-904a-58decfdd53ca"). InnerVolumeSpecName "kube-api-access-jzqjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:54 crc kubenswrapper[4837]: I0313 12:08:54.915085 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707289ff-1434-49b7-904a-58decfdd53ca-config-data" (OuterVolumeSpecName: "config-data") pod "707289ff-1434-49b7-904a-58decfdd53ca" (UID: "707289ff-1434-49b7-904a-58decfdd53ca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:54 crc kubenswrapper[4837]: I0313 12:08:54.917806 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707289ff-1434-49b7-904a-58decfdd53ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "707289ff-1434-49b7-904a-58decfdd53ca" (UID: "707289ff-1434-49b7-904a-58decfdd53ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:54 crc kubenswrapper[4837]: I0313 12:08:54.994443 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzqjg\" (UniqueName: \"kubernetes.io/projected/707289ff-1434-49b7-904a-58decfdd53ca-kube-api-access-jzqjg\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:54 crc kubenswrapper[4837]: I0313 12:08:54.994484 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707289ff-1434-49b7-904a-58decfdd53ca-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:54 crc kubenswrapper[4837]: I0313 12:08:54.994495 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707289ff-1434-49b7-904a-58decfdd53ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.038612 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.074366 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" path="/var/lib/kubelet/pods/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5/volumes" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.085899 4837 generic.go:334] "Generic (PLEG): container finished" podID="f3179576-07e2-4e05-8d10-01e3d694863b" containerID="ccea0ec2cd3b8c08290e7221354973c6421a6b999d1adffb63e82adac076716a" exitCode=0 Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.085990 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3179576-07e2-4e05-8d10-01e3d694863b","Type":"ContainerDied","Data":"ccea0ec2cd3b8c08290e7221354973c6421a6b999d1adffb63e82adac076716a"} Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.089365 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3179576-07e2-4e05-8d10-01e3d694863b","Type":"ContainerDied","Data":"462ea19aaa8a4b2f42cf4a80e03784c4432ff7806e973f2c0cf7363762b9df8e"} Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.089412 4837 scope.go:117] "RemoveContainer" containerID="ccea0ec2cd3b8c08290e7221354973c6421a6b999d1adffb63e82adac076716a" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.089550 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.099537 4837 generic.go:334] "Generic (PLEG): container finished" podID="707289ff-1434-49b7-904a-58decfdd53ca" containerID="de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089" exitCode=0 Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.099594 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.099661 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"707289ff-1434-49b7-904a-58decfdd53ca","Type":"ContainerDied","Data":"de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089"} Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.099700 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"707289ff-1434-49b7-904a-58decfdd53ca","Type":"ContainerDied","Data":"b6785bc0ca408832d28cd32714ea145d9a6e0bbc829424d2cf876cff8cb2427b"} Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.105291 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43","Type":"ContainerStarted","Data":"456c80f0855c2245bf0a0fc6d9cc652dad19c0773d477ea76ccfc7415d5a5c8e"} Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.105341 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43","Type":"ContainerStarted","Data":"9b97f0741ed8dc4568de5acf76058ec03e048925850e843f27b09f36ed5bcf98"} Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.116964 4837 scope.go:117] "RemoveContainer" containerID="0511fe858584c41b4362fa4eb0bbad5c40393493b881d67bbae3af394094d397" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.137072 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.137036845 podStartE2EDuration="2.137036845s" podCreationTimestamp="2026-03-13 12:08:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:55.127241106 +0000 UTC m=+1250.765507899" watchObservedRunningTime="2026-03-13 12:08:55.137036845 +0000 UTC m=+1250.775303608" 
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.144474 4837 scope.go:117] "RemoveContainer" containerID="ccea0ec2cd3b8c08290e7221354973c6421a6b999d1adffb63e82adac076716a" Mar 13 12:08:55 crc kubenswrapper[4837]: E0313 12:08:55.144916 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccea0ec2cd3b8c08290e7221354973c6421a6b999d1adffb63e82adac076716a\": container with ID starting with ccea0ec2cd3b8c08290e7221354973c6421a6b999d1adffb63e82adac076716a not found: ID does not exist" containerID="ccea0ec2cd3b8c08290e7221354973c6421a6b999d1adffb63e82adac076716a" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.144959 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccea0ec2cd3b8c08290e7221354973c6421a6b999d1adffb63e82adac076716a"} err="failed to get container status \"ccea0ec2cd3b8c08290e7221354973c6421a6b999d1adffb63e82adac076716a\": rpc error: code = NotFound desc = could not find container \"ccea0ec2cd3b8c08290e7221354973c6421a6b999d1adffb63e82adac076716a\": container with ID starting with ccea0ec2cd3b8c08290e7221354973c6421a6b999d1adffb63e82adac076716a not found: ID does not exist" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.144988 4837 scope.go:117] "RemoveContainer" containerID="0511fe858584c41b4362fa4eb0bbad5c40393493b881d67bbae3af394094d397" Mar 13 12:08:55 crc kubenswrapper[4837]: E0313 12:08:55.145726 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0511fe858584c41b4362fa4eb0bbad5c40393493b881d67bbae3af394094d397\": container with ID starting with 0511fe858584c41b4362fa4eb0bbad5c40393493b881d67bbae3af394094d397 not found: ID does not exist" containerID="0511fe858584c41b4362fa4eb0bbad5c40393493b881d67bbae3af394094d397" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.145769 4837 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"0511fe858584c41b4362fa4eb0bbad5c40393493b881d67bbae3af394094d397"} err="failed to get container status \"0511fe858584c41b4362fa4eb0bbad5c40393493b881d67bbae3af394094d397\": rpc error: code = NotFound desc = could not find container \"0511fe858584c41b4362fa4eb0bbad5c40393493b881d67bbae3af394094d397\": container with ID starting with 0511fe858584c41b4362fa4eb0bbad5c40393493b881d67bbae3af394094d397 not found: ID does not exist" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.145794 4837 scope.go:117] "RemoveContainer" containerID="de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.149246 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.163103 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.171234 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:08:55 crc kubenswrapper[4837]: E0313 12:08:55.172672 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3179576-07e2-4e05-8d10-01e3d694863b" containerName="nova-api-api" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.172698 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3179576-07e2-4e05-8d10-01e3d694863b" containerName="nova-api-api" Mar 13 12:08:55 crc kubenswrapper[4837]: E0313 12:08:55.172716 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="707289ff-1434-49b7-904a-58decfdd53ca" containerName="nova-scheduler-scheduler" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.172726 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="707289ff-1434-49b7-904a-58decfdd53ca" containerName="nova-scheduler-scheduler" Mar 13 12:08:55 crc kubenswrapper[4837]: E0313 12:08:55.172745 4837 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3179576-07e2-4e05-8d10-01e3d694863b" containerName="nova-api-log"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.172754 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3179576-07e2-4e05-8d10-01e3d694863b" containerName="nova-api-log"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.172980 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3179576-07e2-4e05-8d10-01e3d694863b" containerName="nova-api-api"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.173008 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3179576-07e2-4e05-8d10-01e3d694863b" containerName="nova-api-log"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.173027 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="707289ff-1434-49b7-904a-58decfdd53ca" containerName="nova-scheduler-scheduler"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.175290 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.177896 4837 scope.go:117] "RemoveContainer" containerID="de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.178142 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 13 12:08:55 crc kubenswrapper[4837]: E0313 12:08:55.179067 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089\": container with ID starting with de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089 not found: ID does not exist" containerID="de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.179096 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089"} err="failed to get container status \"de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089\": rpc error: code = NotFound desc = could not find container \"de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089\": container with ID starting with de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089 not found: ID does not exist"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.198599 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3179576-07e2-4e05-8d10-01e3d694863b-logs\") pod \"f3179576-07e2-4e05-8d10-01e3d694863b\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") "
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.198741 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3179576-07e2-4e05-8d10-01e3d694863b-config-data\") pod \"f3179576-07e2-4e05-8d10-01e3d694863b\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") "
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.198900 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85cfl\" (UniqueName: \"kubernetes.io/projected/f3179576-07e2-4e05-8d10-01e3d694863b-kube-api-access-85cfl\") pod \"f3179576-07e2-4e05-8d10-01e3d694863b\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") "
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.198942 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3179576-07e2-4e05-8d10-01e3d694863b-combined-ca-bundle\") pod \"f3179576-07e2-4e05-8d10-01e3d694863b\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") "
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.206931 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3179576-07e2-4e05-8d10-01e3d694863b-logs" (OuterVolumeSpecName: "logs") pod "f3179576-07e2-4e05-8d10-01e3d694863b" (UID: "f3179576-07e2-4e05-8d10-01e3d694863b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.214385 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3179576-07e2-4e05-8d10-01e3d694863b-kube-api-access-85cfl" (OuterVolumeSpecName: "kube-api-access-85cfl") pod "f3179576-07e2-4e05-8d10-01e3d694863b" (UID: "f3179576-07e2-4e05-8d10-01e3d694863b"). InnerVolumeSpecName "kube-api-access-85cfl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.221947 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.232718 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3179576-07e2-4e05-8d10-01e3d694863b-config-data" (OuterVolumeSpecName: "config-data") pod "f3179576-07e2-4e05-8d10-01e3d694863b" (UID: "f3179576-07e2-4e05-8d10-01e3d694863b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.244032 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3179576-07e2-4e05-8d10-01e3d694863b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3179576-07e2-4e05-8d10-01e3d694863b" (UID: "f3179576-07e2-4e05-8d10-01e3d694863b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.300650 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cc7473d-2608-4989-990f-a19d70e8a3a3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4cc7473d-2608-4989-990f-a19d70e8a3a3\") " pod="openstack/nova-scheduler-0"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.300794 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xqdx\" (UniqueName: \"kubernetes.io/projected/4cc7473d-2608-4989-990f-a19d70e8a3a3-kube-api-access-7xqdx\") pod \"nova-scheduler-0\" (UID: \"4cc7473d-2608-4989-990f-a19d70e8a3a3\") " pod="openstack/nova-scheduler-0"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.300954 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cc7473d-2608-4989-990f-a19d70e8a3a3-config-data\") pod \"nova-scheduler-0\" (UID: \"4cc7473d-2608-4989-990f-a19d70e8a3a3\") " pod="openstack/nova-scheduler-0"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.301013 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3179576-07e2-4e05-8d10-01e3d694863b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.301026 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3179576-07e2-4e05-8d10-01e3d694863b-logs\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.301035 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3179576-07e2-4e05-8d10-01e3d694863b-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.301045 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85cfl\" (UniqueName: \"kubernetes.io/projected/f3179576-07e2-4e05-8d10-01e3d694863b-kube-api-access-85cfl\") on node \"crc\" DevicePath \"\""
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.402199 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xqdx\" (UniqueName: \"kubernetes.io/projected/4cc7473d-2608-4989-990f-a19d70e8a3a3-kube-api-access-7xqdx\") pod \"nova-scheduler-0\" (UID: \"4cc7473d-2608-4989-990f-a19d70e8a3a3\") " pod="openstack/nova-scheduler-0"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.402368 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cc7473d-2608-4989-990f-a19d70e8a3a3-config-data\") pod \"nova-scheduler-0\" (UID: \"4cc7473d-2608-4989-990f-a19d70e8a3a3\") " pod="openstack/nova-scheduler-0"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.402415 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cc7473d-2608-4989-990f-a19d70e8a3a3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4cc7473d-2608-4989-990f-a19d70e8a3a3\") " pod="openstack/nova-scheduler-0"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.406549 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cc7473d-2608-4989-990f-a19d70e8a3a3-config-data\") pod \"nova-scheduler-0\" (UID: \"4cc7473d-2608-4989-990f-a19d70e8a3a3\") " pod="openstack/nova-scheduler-0"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.413589 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cc7473d-2608-4989-990f-a19d70e8a3a3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4cc7473d-2608-4989-990f-a19d70e8a3a3\") " pod="openstack/nova-scheduler-0"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.424165 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.424663 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xqdx\" (UniqueName: \"kubernetes.io/projected/4cc7473d-2608-4989-990f-a19d70e8a3a3-kube-api-access-7xqdx\") pod \"nova-scheduler-0\" (UID: \"4cc7473d-2608-4989-990f-a19d70e8a3a3\") " pod="openstack/nova-scheduler-0"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.433732 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.461285 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.463185 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.465577 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.474462 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 13 12:08:55 crc kubenswrapper[4837]: E0313 12:08:55.525416 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3179576_07e2_4e05_8d10_01e3d694863b.slice/crio-462ea19aaa8a4b2f42cf4a80e03784c4432ff7806e973f2c0cf7363762b9df8e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3179576_07e2_4e05_8d10_01e3d694863b.slice\": RecentStats: unable to find data in memory cache]"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.605154 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534b3e48-da2d-41b6-af02-bef43adcac21-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " pod="openstack/nova-api-0"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.605233 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/534b3e48-da2d-41b6-af02-bef43adcac21-logs\") pod \"nova-api-0\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " pod="openstack/nova-api-0"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.605338 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/534b3e48-da2d-41b6-af02-bef43adcac21-config-data\") pod \"nova-api-0\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " pod="openstack/nova-api-0"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.605383 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxww9\" (UniqueName: \"kubernetes.io/projected/534b3e48-da2d-41b6-af02-bef43adcac21-kube-api-access-hxww9\") pod \"nova-api-0\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " pod="openstack/nova-api-0"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.630959 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.711312 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/534b3e48-da2d-41b6-af02-bef43adcac21-config-data\") pod \"nova-api-0\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " pod="openstack/nova-api-0"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.711370 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxww9\" (UniqueName: \"kubernetes.io/projected/534b3e48-da2d-41b6-af02-bef43adcac21-kube-api-access-hxww9\") pod \"nova-api-0\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " pod="openstack/nova-api-0"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.711461 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534b3e48-da2d-41b6-af02-bef43adcac21-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " pod="openstack/nova-api-0"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.711501 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/534b3e48-da2d-41b6-af02-bef43adcac21-logs\") pod \"nova-api-0\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " pod="openstack/nova-api-0"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.711939 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/534b3e48-da2d-41b6-af02-bef43adcac21-logs\") pod \"nova-api-0\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " pod="openstack/nova-api-0"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.722876 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534b3e48-da2d-41b6-af02-bef43adcac21-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " pod="openstack/nova-api-0"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.722974 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/534b3e48-da2d-41b6-af02-bef43adcac21-config-data\") pod \"nova-api-0\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " pod="openstack/nova-api-0"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.735825 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxww9\" (UniqueName: \"kubernetes.io/projected/534b3e48-da2d-41b6-af02-bef43adcac21-kube-api-access-hxww9\") pod \"nova-api-0\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " pod="openstack/nova-api-0"
Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.824721 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 13 12:08:56 crc kubenswrapper[4837]: W0313 12:08:56.073319 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cc7473d_2608_4989_990f_a19d70e8a3a3.slice/crio-7499f46006750dbda5f6e6ebf4a383aa5e9cd8afe4efcdd2e54c6f88540d0454 WatchSource:0}: Error finding container 7499f46006750dbda5f6e6ebf4a383aa5e9cd8afe4efcdd2e54c6f88540d0454: Status 404 returned error can't find the container with id 7499f46006750dbda5f6e6ebf4a383aa5e9cd8afe4efcdd2e54c6f88540d0454
Mar 13 12:08:56 crc kubenswrapper[4837]: I0313 12:08:56.074555 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 13 12:08:56 crc kubenswrapper[4837]: I0313 12:08:56.118364 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4cc7473d-2608-4989-990f-a19d70e8a3a3","Type":"ContainerStarted","Data":"7499f46006750dbda5f6e6ebf4a383aa5e9cd8afe4efcdd2e54c6f88540d0454"}
Mar 13 12:08:56 crc kubenswrapper[4837]: I0313 12:08:56.236990 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 13 12:08:56 crc kubenswrapper[4837]: W0313 12:08:56.239402 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod534b3e48_da2d_41b6_af02_bef43adcac21.slice/crio-bcd98412fdb19b2343b2d9cf6ae91d9ffecdeb486a55232ac24c6fe9606a8706 WatchSource:0}: Error finding container bcd98412fdb19b2343b2d9cf6ae91d9ffecdeb486a55232ac24c6fe9606a8706: Status 404 returned error can't find the container with id bcd98412fdb19b2343b2d9cf6ae91d9ffecdeb486a55232ac24c6fe9606a8706
Mar 13 12:08:57 crc kubenswrapper[4837]: I0313 12:08:57.060876 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="707289ff-1434-49b7-904a-58decfdd53ca" path="/var/lib/kubelet/pods/707289ff-1434-49b7-904a-58decfdd53ca/volumes"
Mar 13 12:08:57 crc kubenswrapper[4837]: I0313 12:08:57.062440 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3179576-07e2-4e05-8d10-01e3d694863b" path="/var/lib/kubelet/pods/f3179576-07e2-4e05-8d10-01e3d694863b/volumes"
Mar 13 12:08:57 crc kubenswrapper[4837]: I0313 12:08:57.129667 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4cc7473d-2608-4989-990f-a19d70e8a3a3","Type":"ContainerStarted","Data":"84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7"}
Mar 13 12:08:57 crc kubenswrapper[4837]: I0313 12:08:57.132948 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"534b3e48-da2d-41b6-af02-bef43adcac21","Type":"ContainerStarted","Data":"8dadb969173ec34b078eb02203c6c8d0426368edb3963a41943b966c8fe59ae6"}
Mar 13 12:08:57 crc kubenswrapper[4837]: I0313 12:08:57.133164 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"534b3e48-da2d-41b6-af02-bef43adcac21","Type":"ContainerStarted","Data":"723ce6d8aa73da7a7f75ecb7c83fcc098afc292834683b8ef574eab17dadd84a"}
Mar 13 12:08:57 crc kubenswrapper[4837]: I0313 12:08:57.133241 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"534b3e48-da2d-41b6-af02-bef43adcac21","Type":"ContainerStarted","Data":"bcd98412fdb19b2343b2d9cf6ae91d9ffecdeb486a55232ac24c6fe9606a8706"}
Mar 13 12:08:57 crc kubenswrapper[4837]: I0313 12:08:57.156728 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.156703975 podStartE2EDuration="2.156703975s" podCreationTimestamp="2026-03-13 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:57.145888235 +0000 UTC m=+1252.784154998" watchObservedRunningTime="2026-03-13 12:08:57.156703975 +0000 UTC m=+1252.794970738"
Mar 13 12:08:57 crc kubenswrapper[4837]: I0313 12:08:57.170769 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.170741507 podStartE2EDuration="2.170741507s" podCreationTimestamp="2026-03-13 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:57.162924801 +0000 UTC m=+1252.801191584" watchObservedRunningTime="2026-03-13 12:08:57.170741507 +0000 UTC m=+1252.809008270"
Mar 13 12:08:58 crc kubenswrapper[4837]: I0313 12:08:58.510941 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 13 12:08:58 crc kubenswrapper[4837]: I0313 12:08:58.511289 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 13 12:09:00 crc kubenswrapper[4837]: I0313 12:09:00.580455 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Mar 13 12:09:00 crc kubenswrapper[4837]: I0313 12:09:00.632059 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 13 12:09:02 crc kubenswrapper[4837]: I0313 12:09:02.468751 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 13 12:09:03 crc kubenswrapper[4837]: I0313 12:09:03.511147 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 13 12:09:03 crc kubenswrapper[4837]: I0313 12:09:03.511403 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 13 12:09:04 crc kubenswrapper[4837]: I0313 12:09:04.524840 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 12:09:04 crc kubenswrapper[4837]: I0313 12:09:04.524885 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 12:09:05 crc kubenswrapper[4837]: I0313 12:09:05.631262 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 13 12:09:05 crc kubenswrapper[4837]: I0313 12:09:05.668869 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 13 12:09:05 crc kubenswrapper[4837]: I0313 12:09:05.827500 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 13 12:09:05 crc kubenswrapper[4837]: I0313 12:09:05.827562 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 13 12:09:05 crc kubenswrapper[4837]: I0313 12:09:05.944215 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 12:09:05 crc kubenswrapper[4837]: I0313 12:09:05.944402 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="a250849d-ca15-40fa-8b1d-a32b5abc6861" containerName="kube-state-metrics" containerID="cri-o://07fc1a83feb8d7932c2b80f34ffbd6218ef230bb996e92d9892feae57b23c402" gracePeriod=30
Mar 13 12:09:06 crc kubenswrapper[4837]: I0313 12:09:06.223914 4837 generic.go:334] "Generic (PLEG): container finished" podID="a250849d-ca15-40fa-8b1d-a32b5abc6861" containerID="07fc1a83feb8d7932c2b80f34ffbd6218ef230bb996e92d9892feae57b23c402" exitCode=2
Mar 13 12:09:06 crc kubenswrapper[4837]: I0313 12:09:06.224916 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a250849d-ca15-40fa-8b1d-a32b5abc6861","Type":"ContainerDied","Data":"07fc1a83feb8d7932c2b80f34ffbd6218ef230bb996e92d9892feae57b23c402"}
Mar 13 12:09:06 crc kubenswrapper[4837]: I0313 12:09:06.273860 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 13 12:09:06 crc kubenswrapper[4837]: I0313 12:09:06.687923 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 13 12:09:06 crc kubenswrapper[4837]: I0313 12:09:06.842940 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vwh6\" (UniqueName: \"kubernetes.io/projected/a250849d-ca15-40fa-8b1d-a32b5abc6861-kube-api-access-9vwh6\") pod \"a250849d-ca15-40fa-8b1d-a32b5abc6861\" (UID: \"a250849d-ca15-40fa-8b1d-a32b5abc6861\") "
Mar 13 12:09:06 crc kubenswrapper[4837]: I0313 12:09:06.850724 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a250849d-ca15-40fa-8b1d-a32b5abc6861-kube-api-access-9vwh6" (OuterVolumeSpecName: "kube-api-access-9vwh6") pod "a250849d-ca15-40fa-8b1d-a32b5abc6861" (UID: "a250849d-ca15-40fa-8b1d-a32b5abc6861"). InnerVolumeSpecName "kube-api-access-9vwh6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:09:06 crc kubenswrapper[4837]: I0313 12:09:06.909842 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="534b3e48-da2d-41b6-af02-bef43adcac21" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.204:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 12:09:06 crc kubenswrapper[4837]: I0313 12:09:06.909877 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="534b3e48-da2d-41b6-af02-bef43adcac21" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.204:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 12:09:06 crc kubenswrapper[4837]: I0313 12:09:06.945493 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vwh6\" (UniqueName: \"kubernetes.io/projected/a250849d-ca15-40fa-8b1d-a32b5abc6861-kube-api-access-9vwh6\") on node \"crc\" DevicePath \"\""
Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.236656 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a250849d-ca15-40fa-8b1d-a32b5abc6861","Type":"ContainerDied","Data":"7a22f32b80bf3ec02fab7028c9c981153ef89481c11b18583b8c1e3f0c67df24"}
Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.236670 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.237032 4837 scope.go:117] "RemoveContainer" containerID="07fc1a83feb8d7932c2b80f34ffbd6218ef230bb996e92d9892feae57b23c402"
Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.277280 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.292521 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.303664 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 12:09:07 crc kubenswrapper[4837]: E0313 12:09:07.304167 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a250849d-ca15-40fa-8b1d-a32b5abc6861" containerName="kube-state-metrics"
Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.304192 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a250849d-ca15-40fa-8b1d-a32b5abc6861" containerName="kube-state-metrics"
Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.304411 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a250849d-ca15-40fa-8b1d-a32b5abc6861" containerName="kube-state-metrics"
Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.305177 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.307263 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.308087 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.313712 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.455753 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd69ff2-e72e-40c0-925f-d0c1c0a40f9a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a\") " pod="openstack/kube-state-metrics-0"
Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.455855 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79vvx\" (UniqueName: \"kubernetes.io/projected/abd69ff2-e72e-40c0-925f-d0c1c0a40f9a-kube-api-access-79vvx\") pod \"kube-state-metrics-0\" (UID: \"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a\") " pod="openstack/kube-state-metrics-0"
Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.455880 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd69ff2-e72e-40c0-925f-d0c1c0a40f9a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a\") " pod="openstack/kube-state-metrics-0"
Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.455899 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/abd69ff2-e72e-40c0-925f-d0c1c0a40f9a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a\") " pod="openstack/kube-state-metrics-0"
Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.558537 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd69ff2-e72e-40c0-925f-d0c1c0a40f9a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a\") " pod="openstack/kube-state-metrics-0"
Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.558588 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79vvx\" (UniqueName: \"kubernetes.io/projected/abd69ff2-e72e-40c0-925f-d0c1c0a40f9a-kube-api-access-79vvx\") pod \"kube-state-metrics-0\" (UID: \"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a\") " pod="openstack/kube-state-metrics-0"
Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.558625 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd69ff2-e72e-40c0-925f-d0c1c0a40f9a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a\") " pod="openstack/kube-state-metrics-0"
Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.558668 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/abd69ff2-e72e-40c0-925f-d0c1c0a40f9a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a\") " pod="openstack/kube-state-metrics-0"
Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.563671 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd69ff2-e72e-40c0-925f-d0c1c0a40f9a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a\") " pod="openstack/kube-state-metrics-0"
Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.564274 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/abd69ff2-e72e-40c0-925f-d0c1c0a40f9a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a\") " pod="openstack/kube-state-metrics-0"
Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.565370 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd69ff2-e72e-40c0-925f-d0c1c0a40f9a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a\") " pod="openstack/kube-state-metrics-0"
Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.573147 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79vvx\" (UniqueName: \"kubernetes.io/projected/abd69ff2-e72e-40c0-925f-d0c1c0a40f9a-kube-api-access-79vvx\") pod \"kube-state-metrics-0\" (UID: \"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a\") " pod="openstack/kube-state-metrics-0"
Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.624337 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 13 12:09:08 crc kubenswrapper[4837]: I0313 12:09:08.061403 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 12:09:08 crc kubenswrapper[4837]: I0313 12:09:08.062006 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="ceilometer-central-agent" containerID="cri-o://d8b81a1d862c648975bd9a812fe1d61df727077dd39a97f4adfc70dac6066075" gracePeriod=30
Mar 13 12:09:08 crc kubenswrapper[4837]: I0313 12:09:08.062139 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="proxy-httpd" containerID="cri-o://94da49d6a7255e5847d10069ee75dd614b4c6eea7e080a518814f780623556e5" gracePeriod=30
Mar 13 12:09:08 crc kubenswrapper[4837]: I0313 12:09:08.062186 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="sg-core" containerID="cri-o://a9f4ef9baf51c5a45fe25c828b539addde1c0065712a676f95056b2183f00569" gracePeriod=30
Mar 13 12:09:08 crc kubenswrapper[4837]: I0313 12:09:08.062228 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="ceilometer-notification-agent" containerID="cri-o://6d24d7cecf025123d4d281213efc8079b0cb18a3f100808ee593959500d93094" gracePeriod=30
Mar 13 12:09:08 crc kubenswrapper[4837]: I0313 12:09:08.150844 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 12:09:08 crc kubenswrapper[4837]: I0313 12:09:08.261203 4837 generic.go:334] "Generic (PLEG): container finished" podID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerID="a9f4ef9baf51c5a45fe25c828b539addde1c0065712a676f95056b2183f00569" exitCode=2
Mar 13 12:09:08 crc kubenswrapper[4837]: I0313 12:09:08.261555 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7f70330-cb87-42e5-96c8-6d54828f2a5a","Type":"ContainerDied","Data":"a9f4ef9baf51c5a45fe25c828b539addde1c0065712a676f95056b2183f00569"}
Mar 13 12:09:08 crc kubenswrapper[4837]: I0313 12:09:08.262989 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a","Type":"ContainerStarted","Data":"8afb9347869a58b1a54ddf30e6a1b29a5a1fcc55ece8e9ecf5f34ecb84524951"}
Mar 13 12:09:09 crc kubenswrapper[4837]: I0313 12:09:09.068907 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a250849d-ca15-40fa-8b1d-a32b5abc6861" path="/var/lib/kubelet/pods/a250849d-ca15-40fa-8b1d-a32b5abc6861/volumes"
Mar 13 12:09:09 crc kubenswrapper[4837]: I0313 12:09:09.277190 4837 generic.go:334] "Generic (PLEG): container finished" podID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerID="94da49d6a7255e5847d10069ee75dd614b4c6eea7e080a518814f780623556e5" exitCode=0
Mar 13 12:09:09 crc kubenswrapper[4837]: I0313 12:09:09.277237 4837 generic.go:334] "Generic (PLEG): container finished" podID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerID="d8b81a1d862c648975bd9a812fe1d61df727077dd39a97f4adfc70dac6066075" exitCode=0
Mar 13 12:09:09 crc kubenswrapper[4837]: I0313 12:09:09.277275 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7f70330-cb87-42e5-96c8-6d54828f2a5a","Type":"ContainerDied","Data":"94da49d6a7255e5847d10069ee75dd614b4c6eea7e080a518814f780623556e5"}
Mar 13 12:09:09 crc kubenswrapper[4837]: I0313 12:09:09.277339 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7f70330-cb87-42e5-96c8-6d54828f2a5a","Type":"ContainerDied","Data":"d8b81a1d862c648975bd9a812fe1d61df727077dd39a97f4adfc70dac6066075"}
Mar 13 12:09:09 crc kubenswrapper[4837]: I0313 12:09:09.279977 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a","Type":"ContainerStarted","Data":"f66785422b595e76fe8cdbc3485cda087523c65c31dd5c0f304d466dab6a34ce"}
Mar 13 12:09:09 crc kubenswrapper[4837]: I0313 12:09:09.280112 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 13 12:09:09 crc kubenswrapper[4837]: I0313 12:09:09.305054 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.933464064 podStartE2EDuration="2.305035574s" podCreationTimestamp="2026-03-13 12:09:07 +0000 UTC" firstStartedPulling="2026-03-13 12:09:08.150996528 +0000 UTC m=+1263.789263291" lastFinishedPulling="2026-03-13 12:09:08.522568018 +0000 UTC m=+1264.160834801" observedRunningTime="2026-03-13 12:09:09.303438884 +0000 UTC m=+1264.941705667" watchObservedRunningTime="2026-03-13 12:09:09.305035574 +0000 UTC m=+1264.943302337"
Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.316466 4837 generic.go:334] "Generic (PLEG): container finished" podID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerID="6d24d7cecf025123d4d281213efc8079b0cb18a3f100808ee593959500d93094" exitCode=0
Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.316509 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7f70330-cb87-42e5-96c8-6d54828f2a5a","Type":"ContainerDied","Data":"6d24d7cecf025123d4d281213efc8079b0cb18a3f100808ee593959500d93094"}
Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.318749 4837 generic.go:334] "Generic (PLEG): container finished" podID="81ec286a-b6df-4462-8023-c01230a50793"
containerID="2b72c4b74ac632994ae39578139216d840009de89378dfe0823503769ad992b6" exitCode=137 Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.318789 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"81ec286a-b6df-4462-8023-c01230a50793","Type":"ContainerDied","Data":"2b72c4b74ac632994ae39578139216d840009de89378dfe0823503769ad992b6"} Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.318816 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"81ec286a-b6df-4462-8023-c01230a50793","Type":"ContainerDied","Data":"78f61644c1756b2a1acf80d548b16d064b0de263e518cd87a1b42cea8c63088a"} Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.318830 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78f61644c1756b2a1acf80d548b16d064b0de263e518cd87a1b42cea8c63088a" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.406135 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.517140 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.517718 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.521134 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.577365 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81ec286a-b6df-4462-8023-c01230a50793-config-data\") pod \"81ec286a-b6df-4462-8023-c01230a50793\" (UID: \"81ec286a-b6df-4462-8023-c01230a50793\") " Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.577511 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srf5b\" (UniqueName: \"kubernetes.io/projected/81ec286a-b6df-4462-8023-c01230a50793-kube-api-access-srf5b\") pod \"81ec286a-b6df-4462-8023-c01230a50793\" (UID: \"81ec286a-b6df-4462-8023-c01230a50793\") " Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.577566 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ec286a-b6df-4462-8023-c01230a50793-combined-ca-bundle\") pod \"81ec286a-b6df-4462-8023-c01230a50793\" (UID: \"81ec286a-b6df-4462-8023-c01230a50793\") " Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.584836 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81ec286a-b6df-4462-8023-c01230a50793-kube-api-access-srf5b" (OuterVolumeSpecName: "kube-api-access-srf5b") pod "81ec286a-b6df-4462-8023-c01230a50793" (UID: 
"81ec286a-b6df-4462-8023-c01230a50793"). InnerVolumeSpecName "kube-api-access-srf5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.604718 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81ec286a-b6df-4462-8023-c01230a50793-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81ec286a-b6df-4462-8023-c01230a50793" (UID: "81ec286a-b6df-4462-8023-c01230a50793"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.607172 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81ec286a-b6df-4462-8023-c01230a50793-config-data" (OuterVolumeSpecName: "config-data") pod "81ec286a-b6df-4462-8023-c01230a50793" (UID: "81ec286a-b6df-4462-8023-c01230a50793"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.628792 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.680540 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81ec286a-b6df-4462-8023-c01230a50793-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.681005 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srf5b\" (UniqueName: \"kubernetes.io/projected/81ec286a-b6df-4462-8023-c01230a50793-kube-api-access-srf5b\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.681020 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ec286a-b6df-4462-8023-c01230a50793-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.782879 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-sg-core-conf-yaml\") pod \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.782981 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzlqd\" (UniqueName: \"kubernetes.io/projected/a7f70330-cb87-42e5-96c8-6d54828f2a5a-kube-api-access-lzlqd\") pod \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.783055 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-combined-ca-bundle\") pod \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 
12:09:13.783116 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7f70330-cb87-42e5-96c8-6d54828f2a5a-run-httpd\") pod \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.783186 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-config-data\") pod \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.783258 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7f70330-cb87-42e5-96c8-6d54828f2a5a-log-httpd\") pod \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.783294 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-scripts\") pod \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.783802 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7f70330-cb87-42e5-96c8-6d54828f2a5a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a7f70330-cb87-42e5-96c8-6d54828f2a5a" (UID: "a7f70330-cb87-42e5-96c8-6d54828f2a5a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.783852 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7f70330-cb87-42e5-96c8-6d54828f2a5a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a7f70330-cb87-42e5-96c8-6d54828f2a5a" (UID: "a7f70330-cb87-42e5-96c8-6d54828f2a5a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.784364 4837 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7f70330-cb87-42e5-96c8-6d54828f2a5a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.784385 4837 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7f70330-cb87-42e5-96c8-6d54828f2a5a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.787290 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7f70330-cb87-42e5-96c8-6d54828f2a5a-kube-api-access-lzlqd" (OuterVolumeSpecName: "kube-api-access-lzlqd") pod "a7f70330-cb87-42e5-96c8-6d54828f2a5a" (UID: "a7f70330-cb87-42e5-96c8-6d54828f2a5a"). InnerVolumeSpecName "kube-api-access-lzlqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.788296 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-scripts" (OuterVolumeSpecName: "scripts") pod "a7f70330-cb87-42e5-96c8-6d54828f2a5a" (UID: "a7f70330-cb87-42e5-96c8-6d54828f2a5a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.812880 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a7f70330-cb87-42e5-96c8-6d54828f2a5a" (UID: "a7f70330-cb87-42e5-96c8-6d54828f2a5a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.857165 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7f70330-cb87-42e5-96c8-6d54828f2a5a" (UID: "a7f70330-cb87-42e5-96c8-6d54828f2a5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.877176 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-config-data" (OuterVolumeSpecName: "config-data") pod "a7f70330-cb87-42e5-96c8-6d54828f2a5a" (UID: "a7f70330-cb87-42e5-96c8-6d54828f2a5a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.886386 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzlqd\" (UniqueName: \"kubernetes.io/projected/a7f70330-cb87-42e5-96c8-6d54828f2a5a-kube-api-access-lzlqd\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.886419 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.886428 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.886437 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.886446 4837 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.329360 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.329367 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7f70330-cb87-42e5-96c8-6d54828f2a5a","Type":"ContainerDied","Data":"c1caae87e2bfbe9657e4b62036ebd200f6d2445955d6ab66a4adb47c94a2fae0"} Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.330596 4837 scope.go:117] "RemoveContainer" containerID="94da49d6a7255e5847d10069ee75dd614b4c6eea7e080a518814f780623556e5" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.329511 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.336972 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.367351 4837 scope.go:117] "RemoveContainer" containerID="a9f4ef9baf51c5a45fe25c828b539addde1c0065712a676f95056b2183f00569" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.390037 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.399806 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.411814 4837 scope.go:117] "RemoveContainer" containerID="6d24d7cecf025123d4d281213efc8079b0cb18a3f100808ee593959500d93094" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.426526 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 12:09:14 crc kubenswrapper[4837]: E0313 12:09:14.427070 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81ec286a-b6df-4462-8023-c01230a50793" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.427092 
4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="81ec286a-b6df-4462-8023-c01230a50793" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 12:09:14 crc kubenswrapper[4837]: E0313 12:09:14.427124 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="ceilometer-central-agent" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.427133 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="ceilometer-central-agent" Mar 13 12:09:14 crc kubenswrapper[4837]: E0313 12:09:14.427158 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="ceilometer-notification-agent" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.427167 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="ceilometer-notification-agent" Mar 13 12:09:14 crc kubenswrapper[4837]: E0313 12:09:14.427177 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="proxy-httpd" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.427184 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="proxy-httpd" Mar 13 12:09:14 crc kubenswrapper[4837]: E0313 12:09:14.427200 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="sg-core" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.427210 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="sg-core" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.427442 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="ceilometer-central-agent" Mar 13 12:09:14 crc 
kubenswrapper[4837]: I0313 12:09:14.427462 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="81ec286a-b6df-4462-8023-c01230a50793" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.427486 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="proxy-httpd" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.427511 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="sg-core" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.427524 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="ceilometer-notification-agent" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.428297 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.441981 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.441981 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.442066 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.442450 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.458401 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.465376 4837 scope.go:117] "RemoveContainer" 
containerID="d8b81a1d862c648975bd9a812fe1d61df727077dd39a97f4adfc70dac6066075" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.475502 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.489035 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.491726 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.494041 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.494569 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.496753 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.499225 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.614172 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.614247 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: 
I0313 12:09:14.614309 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/662e258d-fe94-4373-912d-c906f1e93c90-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.614327 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/662e258d-fe94-4373-912d-c906f1e93c90-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.614378 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/662e258d-fe94-4373-912d-c906f1e93c90-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.614424 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87f0825e-ff58-4bf4-bf83-6522dcc333e2-run-httpd\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.614443 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmn4v\" (UniqueName: \"kubernetes.io/projected/662e258d-fe94-4373-912d-c906f1e93c90-kube-api-access-xmn4v\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 
12:09:14.614463 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.614513 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzw9w\" (UniqueName: \"kubernetes.io/projected/87f0825e-ff58-4bf4-bf83-6522dcc333e2-kube-api-access-bzw9w\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.614743 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87f0825e-ff58-4bf4-bf83-6522dcc333e2-log-httpd\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.614895 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/662e258d-fe94-4373-912d-c906f1e93c90-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.614956 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-scripts\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.615095 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-config-data\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.717083 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.717434 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.717494 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/662e258d-fe94-4373-912d-c906f1e93c90-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.717518 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/662e258d-fe94-4373-912d-c906f1e93c90-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.717557 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/662e258d-fe94-4373-912d-c906f1e93c90-combined-ca-bundle\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.717587 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87f0825e-ff58-4bf4-bf83-6522dcc333e2-run-httpd\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.717611 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmn4v\" (UniqueName: \"kubernetes.io/projected/662e258d-fe94-4373-912d-c906f1e93c90-kube-api-access-xmn4v\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.717649 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.717677 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzw9w\" (UniqueName: \"kubernetes.io/projected/87f0825e-ff58-4bf4-bf83-6522dcc333e2-kube-api-access-bzw9w\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.717707 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87f0825e-ff58-4bf4-bf83-6522dcc333e2-log-httpd\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 
12:09:14.717732 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/662e258d-fe94-4373-912d-c906f1e93c90-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.717748 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-scripts\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.717775 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-config-data\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.718969 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87f0825e-ff58-4bf4-bf83-6522dcc333e2-log-httpd\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.719359 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87f0825e-ff58-4bf4-bf83-6522dcc333e2-run-httpd\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.721385 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.721676 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-scripts\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.721865 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/662e258d-fe94-4373-912d-c906f1e93c90-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.721946 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/662e258d-fe94-4373-912d-c906f1e93c90-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.723078 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-config-data\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.723513 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.727374 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.728343 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/662e258d-fe94-4373-912d-c906f1e93c90-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.737106 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/662e258d-fe94-4373-912d-c906f1e93c90-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.741394 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzw9w\" (UniqueName: \"kubernetes.io/projected/87f0825e-ff58-4bf4-bf83-6522dcc333e2-kube-api-access-bzw9w\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.749220 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmn4v\" (UniqueName: \"kubernetes.io/projected/662e258d-fe94-4373-912d-c906f1e93c90-kube-api-access-xmn4v\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.767734 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.807097 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:09:15 crc kubenswrapper[4837]: I0313 12:09:15.063757 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81ec286a-b6df-4462-8023-c01230a50793" path="/var/lib/kubelet/pods/81ec286a-b6df-4462-8023-c01230a50793/volumes" Mar 13 12:09:15 crc kubenswrapper[4837]: I0313 12:09:15.064507 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" path="/var/lib/kubelet/pods/a7f70330-cb87-42e5-96c8-6d54828f2a5a/volumes" Mar 13 12:09:15 crc kubenswrapper[4837]: I0313 12:09:15.260986 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 12:09:15 crc kubenswrapper[4837]: I0313 12:09:15.380655 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"662e258d-fe94-4373-912d-c906f1e93c90","Type":"ContainerStarted","Data":"e242c6f26aa5b6f1cc1df2ec9bee4085137c95b2dd848bcff17437ed302ab123"} Mar 13 12:09:15 crc kubenswrapper[4837]: I0313 12:09:15.385049 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:15 crc kubenswrapper[4837]: I0313 12:09:15.829751 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 12:09:15 crc kubenswrapper[4837]: I0313 12:09:15.830129 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 12:09:15 crc kubenswrapper[4837]: I0313 12:09:15.830556 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 12:09:15 crc kubenswrapper[4837]: I0313 12:09:15.830624 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 
13 12:09:15 crc kubenswrapper[4837]: I0313 12:09:15.833395 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 12:09:15 crc kubenswrapper[4837]: I0313 12:09:15.835542 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.044883 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-ql9zn"] Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.046788 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.063823 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-ql9zn"] Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.144424 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.144804 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-config\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.144891 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " 
pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.144966 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.145083 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.145114 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdwv8\" (UniqueName: \"kubernetes.io/projected/6d9c85e6-5c66-4c94-996b-0278453fd29c-kube-api-access-zdwv8\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.247268 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.247432 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") 
" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.247570 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.247631 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdwv8\" (UniqueName: \"kubernetes.io/projected/6d9c85e6-5c66-4c94-996b-0278453fd29c-kube-api-access-zdwv8\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.247752 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.247790 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-config\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.248385 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc 
kubenswrapper[4837]: I0313 12:09:16.251448 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-config\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.252499 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.252515 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.252705 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.272953 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdwv8\" (UniqueName: \"kubernetes.io/projected/6d9c85e6-5c66-4c94-996b-0278453fd29c-kube-api-access-zdwv8\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.380787 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.394397 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"662e258d-fe94-4373-912d-c906f1e93c90","Type":"ContainerStarted","Data":"3ce5094b532dea20025f607d318aa78612e1278ac2854d7f4b06ea6f2e4d4746"} Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.402123 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87f0825e-ff58-4bf4-bf83-6522dcc333e2","Type":"ContainerStarted","Data":"26f413af49f804fe13ff8c9b2b887ef2e357277501d29cf746c057c4c5c85b88"} Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.402174 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87f0825e-ff58-4bf4-bf83-6522dcc333e2","Type":"ContainerStarted","Data":"8371eb3af69417b51c9486141ddff58bfc7ec752522d7b166877385a6b1e772e"} Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.410673 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.410656599 podStartE2EDuration="2.410656599s" podCreationTimestamp="2026-03-13 12:09:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:09:16.409917896 +0000 UTC m=+1272.048184659" watchObservedRunningTime="2026-03-13 12:09:16.410656599 +0000 UTC m=+1272.048923362" Mar 13 12:09:17 crc kubenswrapper[4837]: W0313 12:09:16.886731 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d9c85e6_5c66_4c94_996b_0278453fd29c.slice/crio-b047a2dec8a72a4326c0ddd3270cd5183bb922bd572e2b9241c600e148c1eea7 WatchSource:0}: Error finding container b047a2dec8a72a4326c0ddd3270cd5183bb922bd572e2b9241c600e148c1eea7: Status 404 returned error can't find the 
container with id b047a2dec8a72a4326c0ddd3270cd5183bb922bd572e2b9241c600e148c1eea7 Mar 13 12:09:17 crc kubenswrapper[4837]: I0313 12:09:16.896017 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-ql9zn"] Mar 13 12:09:17 crc kubenswrapper[4837]: I0313 12:09:17.412223 4837 generic.go:334] "Generic (PLEG): container finished" podID="6d9c85e6-5c66-4c94-996b-0278453fd29c" containerID="0ac8018727334fad931d8e9b782b5ff6d28c6c9743c0f7da2e79336a427ee5cf" exitCode=0 Mar 13 12:09:17 crc kubenswrapper[4837]: I0313 12:09:17.412302 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" event={"ID":"6d9c85e6-5c66-4c94-996b-0278453fd29c","Type":"ContainerDied","Data":"0ac8018727334fad931d8e9b782b5ff6d28c6c9743c0f7da2e79336a427ee5cf"} Mar 13 12:09:17 crc kubenswrapper[4837]: I0313 12:09:17.412614 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" event={"ID":"6d9c85e6-5c66-4c94-996b-0278453fd29c","Type":"ContainerStarted","Data":"b047a2dec8a72a4326c0ddd3270cd5183bb922bd572e2b9241c600e148c1eea7"} Mar 13 12:09:17 crc kubenswrapper[4837]: I0313 12:09:17.416646 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87f0825e-ff58-4bf4-bf83-6522dcc333e2","Type":"ContainerStarted","Data":"e7e5f4559015b914e67c3cf32dcbe3ce14a32ed203cfc64d9b54f0456af10b72"} Mar 13 12:09:17 crc kubenswrapper[4837]: I0313 12:09:17.642677 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 13 12:09:18 crc kubenswrapper[4837]: I0313 12:09:18.431225 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" event={"ID":"6d9c85e6-5c66-4c94-996b-0278453fd29c","Type":"ContainerStarted","Data":"daf8bcea9fd0562663127a8a93a369152f67e1407f2a8bee704558594419d5d6"} Mar 13 12:09:18 crc kubenswrapper[4837]: I0313 12:09:18.432361 4837 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:18 crc kubenswrapper[4837]: I0313 12:09:18.437193 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87f0825e-ff58-4bf4-bf83-6522dcc333e2","Type":"ContainerStarted","Data":"0ad1a68e05e02f84e1d06c1b173ebb7a467268f5b60941fa800e2751955e3df1"} Mar 13 12:09:18 crc kubenswrapper[4837]: I0313 12:09:18.459584 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" podStartSLOduration=3.45955511 podStartE2EDuration="3.45955511s" podCreationTimestamp="2026-03-13 12:09:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:09:18.458385084 +0000 UTC m=+1274.096651847" watchObservedRunningTime="2026-03-13 12:09:18.45955511 +0000 UTC m=+1274.097821863" Mar 13 12:09:18 crc kubenswrapper[4837]: I0313 12:09:18.612490 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:09:18 crc kubenswrapper[4837]: I0313 12:09:18.612946 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="534b3e48-da2d-41b6-af02-bef43adcac21" containerName="nova-api-log" containerID="cri-o://723ce6d8aa73da7a7f75ecb7c83fcc098afc292834683b8ef574eab17dadd84a" gracePeriod=30 Mar 13 12:09:18 crc kubenswrapper[4837]: I0313 12:09:18.613116 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="534b3e48-da2d-41b6-af02-bef43adcac21" containerName="nova-api-api" containerID="cri-o://8dadb969173ec34b078eb02203c6c8d0426368edb3963a41943b966c8fe59ae6" gracePeriod=30 Mar 13 12:09:19 crc kubenswrapper[4837]: I0313 12:09:19.146307 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:19 crc kubenswrapper[4837]: I0313 12:09:19.449379 
4837 generic.go:334] "Generic (PLEG): container finished" podID="534b3e48-da2d-41b6-af02-bef43adcac21" containerID="723ce6d8aa73da7a7f75ecb7c83fcc098afc292834683b8ef574eab17dadd84a" exitCode=143 Mar 13 12:09:19 crc kubenswrapper[4837]: I0313 12:09:19.449448 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"534b3e48-da2d-41b6-af02-bef43adcac21","Type":"ContainerDied","Data":"723ce6d8aa73da7a7f75ecb7c83fcc098afc292834683b8ef574eab17dadd84a"} Mar 13 12:09:19 crc kubenswrapper[4837]: I0313 12:09:19.768902 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:20 crc kubenswrapper[4837]: I0313 12:09:20.462282 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87f0825e-ff58-4bf4-bf83-6522dcc333e2","Type":"ContainerStarted","Data":"770ef0a10fb37e3814cf493f7e62eb7e15ce55983a506bec280081e698fad5c8"} Mar 13 12:09:20 crc kubenswrapper[4837]: I0313 12:09:20.462442 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="proxy-httpd" containerID="cri-o://770ef0a10fb37e3814cf493f7e62eb7e15ce55983a506bec280081e698fad5c8" gracePeriod=30 Mar 13 12:09:20 crc kubenswrapper[4837]: I0313 12:09:20.462495 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 12:09:20 crc kubenswrapper[4837]: I0313 12:09:20.462443 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="sg-core" containerID="cri-o://0ad1a68e05e02f84e1d06c1b173ebb7a467268f5b60941fa800e2751955e3df1" gracePeriod=30 Mar 13 12:09:20 crc kubenswrapper[4837]: I0313 12:09:20.462409 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="ceilometer-central-agent" containerID="cri-o://26f413af49f804fe13ff8c9b2b887ef2e357277501d29cf746c057c4c5c85b88" gracePeriod=30 Mar 13 12:09:20 crc kubenswrapper[4837]: I0313 12:09:20.462530 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="ceilometer-notification-agent" containerID="cri-o://e7e5f4559015b914e67c3cf32dcbe3ce14a32ed203cfc64d9b54f0456af10b72" gracePeriod=30 Mar 13 12:09:20 crc kubenswrapper[4837]: I0313 12:09:20.494848 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.23566101 podStartE2EDuration="6.494827362s" podCreationTimestamp="2026-03-13 12:09:14 +0000 UTC" firstStartedPulling="2026-03-13 12:09:15.39019404 +0000 UTC m=+1271.028460803" lastFinishedPulling="2026-03-13 12:09:19.649360392 +0000 UTC m=+1275.287627155" observedRunningTime="2026-03-13 12:09:20.481139031 +0000 UTC m=+1276.119405804" watchObservedRunningTime="2026-03-13 12:09:20.494827362 +0000 UTC m=+1276.133094125" Mar 13 12:09:21 crc kubenswrapper[4837]: I0313 12:09:21.477719 4837 generic.go:334] "Generic (PLEG): container finished" podID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerID="770ef0a10fb37e3814cf493f7e62eb7e15ce55983a506bec280081e698fad5c8" exitCode=0 Mar 13 12:09:21 crc kubenswrapper[4837]: I0313 12:09:21.477808 4837 generic.go:334] "Generic (PLEG): container finished" podID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerID="0ad1a68e05e02f84e1d06c1b173ebb7a467268f5b60941fa800e2751955e3df1" exitCode=2 Mar 13 12:09:21 crc kubenswrapper[4837]: I0313 12:09:21.477829 4837 generic.go:334] "Generic (PLEG): container finished" podID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerID="e7e5f4559015b914e67c3cf32dcbe3ce14a32ed203cfc64d9b54f0456af10b72" exitCode=0 Mar 13 12:09:21 crc kubenswrapper[4837]: I0313 12:09:21.477758 4837 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87f0825e-ff58-4bf4-bf83-6522dcc333e2","Type":"ContainerDied","Data":"770ef0a10fb37e3814cf493f7e62eb7e15ce55983a506bec280081e698fad5c8"} Mar 13 12:09:21 crc kubenswrapper[4837]: I0313 12:09:21.477882 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87f0825e-ff58-4bf4-bf83-6522dcc333e2","Type":"ContainerDied","Data":"0ad1a68e05e02f84e1d06c1b173ebb7a467268f5b60941fa800e2751955e3df1"} Mar 13 12:09:21 crc kubenswrapper[4837]: I0313 12:09:21.477904 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87f0825e-ff58-4bf4-bf83-6522dcc333e2","Type":"ContainerDied","Data":"e7e5f4559015b914e67c3cf32dcbe3ce14a32ed203cfc64d9b54f0456af10b72"} Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.241958 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.307560 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/534b3e48-da2d-41b6-af02-bef43adcac21-logs\") pod \"534b3e48-da2d-41b6-af02-bef43adcac21\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.307654 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534b3e48-da2d-41b6-af02-bef43adcac21-combined-ca-bundle\") pod \"534b3e48-da2d-41b6-af02-bef43adcac21\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.307760 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/534b3e48-da2d-41b6-af02-bef43adcac21-config-data\") pod \"534b3e48-da2d-41b6-af02-bef43adcac21\" (UID: 
\"534b3e48-da2d-41b6-af02-bef43adcac21\") " Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.307857 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxww9\" (UniqueName: \"kubernetes.io/projected/534b3e48-da2d-41b6-af02-bef43adcac21-kube-api-access-hxww9\") pod \"534b3e48-da2d-41b6-af02-bef43adcac21\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.308524 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/534b3e48-da2d-41b6-af02-bef43adcac21-logs" (OuterVolumeSpecName: "logs") pod "534b3e48-da2d-41b6-af02-bef43adcac21" (UID: "534b3e48-da2d-41b6-af02-bef43adcac21"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.310332 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/534b3e48-da2d-41b6-af02-bef43adcac21-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.315302 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/534b3e48-da2d-41b6-af02-bef43adcac21-kube-api-access-hxww9" (OuterVolumeSpecName: "kube-api-access-hxww9") pod "534b3e48-da2d-41b6-af02-bef43adcac21" (UID: "534b3e48-da2d-41b6-af02-bef43adcac21"). InnerVolumeSpecName "kube-api-access-hxww9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.334348 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/534b3e48-da2d-41b6-af02-bef43adcac21-config-data" (OuterVolumeSpecName: "config-data") pod "534b3e48-da2d-41b6-af02-bef43adcac21" (UID: "534b3e48-da2d-41b6-af02-bef43adcac21"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.359231 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/534b3e48-da2d-41b6-af02-bef43adcac21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "534b3e48-da2d-41b6-af02-bef43adcac21" (UID: "534b3e48-da2d-41b6-af02-bef43adcac21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.412243 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/534b3e48-da2d-41b6-af02-bef43adcac21-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.412288 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxww9\" (UniqueName: \"kubernetes.io/projected/534b3e48-da2d-41b6-af02-bef43adcac21-kube-api-access-hxww9\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.412301 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534b3e48-da2d-41b6-af02-bef43adcac21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.507526 4837 generic.go:334] "Generic (PLEG): container finished" podID="534b3e48-da2d-41b6-af02-bef43adcac21" containerID="8dadb969173ec34b078eb02203c6c8d0426368edb3963a41943b966c8fe59ae6" exitCode=0 Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.507573 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"534b3e48-da2d-41b6-af02-bef43adcac21","Type":"ContainerDied","Data":"8dadb969173ec34b078eb02203c6c8d0426368edb3963a41943b966c8fe59ae6"} Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.507588 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.507613 4837 scope.go:117] "RemoveContainer" containerID="8dadb969173ec34b078eb02203c6c8d0426368edb3963a41943b966c8fe59ae6" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.507599 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"534b3e48-da2d-41b6-af02-bef43adcac21","Type":"ContainerDied","Data":"bcd98412fdb19b2343b2d9cf6ae91d9ffecdeb486a55232ac24c6fe9606a8706"} Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.545952 4837 scope.go:117] "RemoveContainer" containerID="723ce6d8aa73da7a7f75ecb7c83fcc098afc292834683b8ef574eab17dadd84a" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.555146 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.569010 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.576909 4837 scope.go:117] "RemoveContainer" containerID="8dadb969173ec34b078eb02203c6c8d0426368edb3963a41943b966c8fe59ae6" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.579631 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 12:09:22 crc kubenswrapper[4837]: E0313 12:09:22.580099 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="534b3e48-da2d-41b6-af02-bef43adcac21" containerName="nova-api-api" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.580121 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="534b3e48-da2d-41b6-af02-bef43adcac21" containerName="nova-api-api" Mar 13 12:09:22 crc kubenswrapper[4837]: E0313 12:09:22.580143 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="534b3e48-da2d-41b6-af02-bef43adcac21" containerName="nova-api-log" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.580155 4837 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="534b3e48-da2d-41b6-af02-bef43adcac21" containerName="nova-api-log" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.580414 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="534b3e48-da2d-41b6-af02-bef43adcac21" containerName="nova-api-api" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.580435 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="534b3e48-da2d-41b6-af02-bef43adcac21" containerName="nova-api-log" Mar 13 12:09:22 crc kubenswrapper[4837]: E0313 12:09:22.580803 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dadb969173ec34b078eb02203c6c8d0426368edb3963a41943b966c8fe59ae6\": container with ID starting with 8dadb969173ec34b078eb02203c6c8d0426368edb3963a41943b966c8fe59ae6 not found: ID does not exist" containerID="8dadb969173ec34b078eb02203c6c8d0426368edb3963a41943b966c8fe59ae6" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.580855 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dadb969173ec34b078eb02203c6c8d0426368edb3963a41943b966c8fe59ae6"} err="failed to get container status \"8dadb969173ec34b078eb02203c6c8d0426368edb3963a41943b966c8fe59ae6\": rpc error: code = NotFound desc = could not find container \"8dadb969173ec34b078eb02203c6c8d0426368edb3963a41943b966c8fe59ae6\": container with ID starting with 8dadb969173ec34b078eb02203c6c8d0426368edb3963a41943b966c8fe59ae6 not found: ID does not exist" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.580887 4837 scope.go:117] "RemoveContainer" containerID="723ce6d8aa73da7a7f75ecb7c83fcc098afc292834683b8ef574eab17dadd84a" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.581678 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: E0313 12:09:22.582836 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"723ce6d8aa73da7a7f75ecb7c83fcc098afc292834683b8ef574eab17dadd84a\": container with ID starting with 723ce6d8aa73da7a7f75ecb7c83fcc098afc292834683b8ef574eab17dadd84a not found: ID does not exist" containerID="723ce6d8aa73da7a7f75ecb7c83fcc098afc292834683b8ef574eab17dadd84a" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.582872 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"723ce6d8aa73da7a7f75ecb7c83fcc098afc292834683b8ef574eab17dadd84a"} err="failed to get container status \"723ce6d8aa73da7a7f75ecb7c83fcc098afc292834683b8ef574eab17dadd84a\": rpc error: code = NotFound desc = could not find container \"723ce6d8aa73da7a7f75ecb7c83fcc098afc292834683b8ef574eab17dadd84a\": container with ID starting with 723ce6d8aa73da7a7f75ecb7c83fcc098afc292834683b8ef574eab17dadd84a not found: ID does not exist" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.595236 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.623903 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.624521 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.624843 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.726114 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.726219 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.726265 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78c9774b-e6ed-434a-9a05-77de64d14c5c-logs\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.726305 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-config-data\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.726429 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-public-tls-certs\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.726496 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52nqk\" (UniqueName: \"kubernetes.io/projected/78c9774b-e6ed-434a-9a05-77de64d14c5c-kube-api-access-52nqk\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") 
" pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.828738 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-config-data\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.828896 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-public-tls-certs\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.828988 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52nqk\" (UniqueName: \"kubernetes.io/projected/78c9774b-e6ed-434a-9a05-77de64d14c5c-kube-api-access-52nqk\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.829043 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.829150 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.829258 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/78c9774b-e6ed-434a-9a05-77de64d14c5c-logs\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.829687 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78c9774b-e6ed-434a-9a05-77de64d14c5c-logs\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.835522 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.835611 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-config-data\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.836082 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-public-tls-certs\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.837057 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.846112 4837 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-52nqk\" (UniqueName: \"kubernetes.io/projected/78c9774b-e6ed-434a-9a05-77de64d14c5c-kube-api-access-52nqk\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.962356 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:09:23 crc kubenswrapper[4837]: I0313 12:09:23.062905 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="534b3e48-da2d-41b6-af02-bef43adcac21" path="/var/lib/kubelet/pods/534b3e48-da2d-41b6-af02-bef43adcac21/volumes" Mar 13 12:09:23 crc kubenswrapper[4837]: I0313 12:09:23.439681 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:09:23 crc kubenswrapper[4837]: I0313 12:09:23.521252 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78c9774b-e6ed-434a-9a05-77de64d14c5c","Type":"ContainerStarted","Data":"196c93139204cac88ac74bf775fe6446d52e56b24f2532d6b6a8393e6ffd7da4"} Mar 13 12:09:24 crc kubenswrapper[4837]: I0313 12:09:24.532628 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78c9774b-e6ed-434a-9a05-77de64d14c5c","Type":"ContainerStarted","Data":"4ca07acd2a41f0c3f14868adbd5c92b4f4e135a23bfc2d9caf191ce9bdb4d94a"} Mar 13 12:09:24 crc kubenswrapper[4837]: I0313 12:09:24.532993 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78c9774b-e6ed-434a-9a05-77de64d14c5c","Type":"ContainerStarted","Data":"7b6a520617c17e8a5aea39cc8ad6ac82a12e5d413248ffe3dd096388d1de71a6"} Mar 13 12:09:24 crc kubenswrapper[4837]: I0313 12:09:24.552869 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.552847102 podStartE2EDuration="2.552847102s" podCreationTimestamp="2026-03-13 12:09:22 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:09:24.54773584 +0000 UTC m=+1280.186002603" watchObservedRunningTime="2026-03-13 12:09:24.552847102 +0000 UTC m=+1280.191113875" Mar 13 12:09:24 crc kubenswrapper[4837]: I0313 12:09:24.768525 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:24 crc kubenswrapper[4837]: I0313 12:09:24.788664 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.350603 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.490522 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-combined-ca-bundle\") pod \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.490804 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87f0825e-ff58-4bf4-bf83-6522dcc333e2-run-httpd\") pod \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.491151 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87f0825e-ff58-4bf4-bf83-6522dcc333e2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "87f0825e-ff58-4bf4-bf83-6522dcc333e2" (UID: "87f0825e-ff58-4bf4-bf83-6522dcc333e2"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.491354 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-ceilometer-tls-certs\") pod \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.491799 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-sg-core-conf-yaml\") pod \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.491940 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87f0825e-ff58-4bf4-bf83-6522dcc333e2-log-httpd\") pod \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.492060 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-scripts\") pod \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.492525 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-config-data\") pod \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.492690 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzw9w\" (UniqueName: 
\"kubernetes.io/projected/87f0825e-ff58-4bf4-bf83-6522dcc333e2-kube-api-access-bzw9w\") pod \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.492461 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87f0825e-ff58-4bf4-bf83-6522dcc333e2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "87f0825e-ff58-4bf4-bf83-6522dcc333e2" (UID: "87f0825e-ff58-4bf4-bf83-6522dcc333e2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.494098 4837 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87f0825e-ff58-4bf4-bf83-6522dcc333e2-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.494189 4837 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87f0825e-ff58-4bf4-bf83-6522dcc333e2-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.505712 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87f0825e-ff58-4bf4-bf83-6522dcc333e2-kube-api-access-bzw9w" (OuterVolumeSpecName: "kube-api-access-bzw9w") pod "87f0825e-ff58-4bf4-bf83-6522dcc333e2" (UID: "87f0825e-ff58-4bf4-bf83-6522dcc333e2"). InnerVolumeSpecName "kube-api-access-bzw9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.506908 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-scripts" (OuterVolumeSpecName: "scripts") pod "87f0825e-ff58-4bf4-bf83-6522dcc333e2" (UID: "87f0825e-ff58-4bf4-bf83-6522dcc333e2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.529858 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "87f0825e-ff58-4bf4-bf83-6522dcc333e2" (UID: "87f0825e-ff58-4bf4-bf83-6522dcc333e2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.545945 4837 generic.go:334] "Generic (PLEG): container finished" podID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerID="26f413af49f804fe13ff8c9b2b887ef2e357277501d29cf746c057c4c5c85b88" exitCode=0 Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.545993 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87f0825e-ff58-4bf4-bf83-6522dcc333e2","Type":"ContainerDied","Data":"26f413af49f804fe13ff8c9b2b887ef2e357277501d29cf746c057c4c5c85b88"} Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.546039 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87f0825e-ff58-4bf4-bf83-6522dcc333e2","Type":"ContainerDied","Data":"8371eb3af69417b51c9486141ddff58bfc7ec752522d7b166877385a6b1e772e"} Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.546060 4837 scope.go:117] "RemoveContainer" containerID="770ef0a10fb37e3814cf493f7e62eb7e15ce55983a506bec280081e698fad5c8" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.546095 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.553586 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "87f0825e-ff58-4bf4-bf83-6522dcc333e2" (UID: "87f0825e-ff58-4bf4-bf83-6522dcc333e2"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.564502 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.606802 4837 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.606839 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.606850 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzw9w\" (UniqueName: \"kubernetes.io/projected/87f0825e-ff58-4bf4-bf83-6522dcc333e2-kube-api-access-bzw9w\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.606859 4837 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.606928 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "87f0825e-ff58-4bf4-bf83-6522dcc333e2" (UID: "87f0825e-ff58-4bf4-bf83-6522dcc333e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.624727 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-config-data" (OuterVolumeSpecName: "config-data") pod "87f0825e-ff58-4bf4-bf83-6522dcc333e2" (UID: "87f0825e-ff58-4bf4-bf83-6522dcc333e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.662762 4837 scope.go:117] "RemoveContainer" containerID="0ad1a68e05e02f84e1d06c1b173ebb7a467268f5b60941fa800e2751955e3df1" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.684327 4837 scope.go:117] "RemoveContainer" containerID="e7e5f4559015b914e67c3cf32dcbe3ce14a32ed203cfc64d9b54f0456af10b72" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.708679 4837 scope.go:117] "RemoveContainer" containerID="26f413af49f804fe13ff8c9b2b887ef2e357277501d29cf746c057c4c5c85b88" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.710229 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.710279 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.731980 4837 scope.go:117] "RemoveContainer" containerID="770ef0a10fb37e3814cf493f7e62eb7e15ce55983a506bec280081e698fad5c8" Mar 13 12:09:25 crc kubenswrapper[4837]: E0313 12:09:25.733906 4837 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"770ef0a10fb37e3814cf493f7e62eb7e15ce55983a506bec280081e698fad5c8\": container with ID starting with 770ef0a10fb37e3814cf493f7e62eb7e15ce55983a506bec280081e698fad5c8 not found: ID does not exist" containerID="770ef0a10fb37e3814cf493f7e62eb7e15ce55983a506bec280081e698fad5c8" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.733942 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"770ef0a10fb37e3814cf493f7e62eb7e15ce55983a506bec280081e698fad5c8"} err="failed to get container status \"770ef0a10fb37e3814cf493f7e62eb7e15ce55983a506bec280081e698fad5c8\": rpc error: code = NotFound desc = could not find container \"770ef0a10fb37e3814cf493f7e62eb7e15ce55983a506bec280081e698fad5c8\": container with ID starting with 770ef0a10fb37e3814cf493f7e62eb7e15ce55983a506bec280081e698fad5c8 not found: ID does not exist" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.733965 4837 scope.go:117] "RemoveContainer" containerID="0ad1a68e05e02f84e1d06c1b173ebb7a467268f5b60941fa800e2751955e3df1" Mar 13 12:09:25 crc kubenswrapper[4837]: E0313 12:09:25.734437 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ad1a68e05e02f84e1d06c1b173ebb7a467268f5b60941fa800e2751955e3df1\": container with ID starting with 0ad1a68e05e02f84e1d06c1b173ebb7a467268f5b60941fa800e2751955e3df1 not found: ID does not exist" containerID="0ad1a68e05e02f84e1d06c1b173ebb7a467268f5b60941fa800e2751955e3df1" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.734475 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ad1a68e05e02f84e1d06c1b173ebb7a467268f5b60941fa800e2751955e3df1"} err="failed to get container status \"0ad1a68e05e02f84e1d06c1b173ebb7a467268f5b60941fa800e2751955e3df1\": rpc error: code = NotFound desc = could not find 
container \"0ad1a68e05e02f84e1d06c1b173ebb7a467268f5b60941fa800e2751955e3df1\": container with ID starting with 0ad1a68e05e02f84e1d06c1b173ebb7a467268f5b60941fa800e2751955e3df1 not found: ID does not exist" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.734494 4837 scope.go:117] "RemoveContainer" containerID="e7e5f4559015b914e67c3cf32dcbe3ce14a32ed203cfc64d9b54f0456af10b72" Mar 13 12:09:25 crc kubenswrapper[4837]: E0313 12:09:25.735169 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7e5f4559015b914e67c3cf32dcbe3ce14a32ed203cfc64d9b54f0456af10b72\": container with ID starting with e7e5f4559015b914e67c3cf32dcbe3ce14a32ed203cfc64d9b54f0456af10b72 not found: ID does not exist" containerID="e7e5f4559015b914e67c3cf32dcbe3ce14a32ed203cfc64d9b54f0456af10b72" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.735240 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7e5f4559015b914e67c3cf32dcbe3ce14a32ed203cfc64d9b54f0456af10b72"} err="failed to get container status \"e7e5f4559015b914e67c3cf32dcbe3ce14a32ed203cfc64d9b54f0456af10b72\": rpc error: code = NotFound desc = could not find container \"e7e5f4559015b914e67c3cf32dcbe3ce14a32ed203cfc64d9b54f0456af10b72\": container with ID starting with e7e5f4559015b914e67c3cf32dcbe3ce14a32ed203cfc64d9b54f0456af10b72 not found: ID does not exist" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.735285 4837 scope.go:117] "RemoveContainer" containerID="26f413af49f804fe13ff8c9b2b887ef2e357277501d29cf746c057c4c5c85b88" Mar 13 12:09:25 crc kubenswrapper[4837]: E0313 12:09:25.735692 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26f413af49f804fe13ff8c9b2b887ef2e357277501d29cf746c057c4c5c85b88\": container with ID starting with 26f413af49f804fe13ff8c9b2b887ef2e357277501d29cf746c057c4c5c85b88 not found: ID does 
not exist" containerID="26f413af49f804fe13ff8c9b2b887ef2e357277501d29cf746c057c4c5c85b88" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.735720 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26f413af49f804fe13ff8c9b2b887ef2e357277501d29cf746c057c4c5c85b88"} err="failed to get container status \"26f413af49f804fe13ff8c9b2b887ef2e357277501d29cf746c057c4c5c85b88\": rpc error: code = NotFound desc = could not find container \"26f413af49f804fe13ff8c9b2b887ef2e357277501d29cf746c057c4c5c85b88\": container with ID starting with 26f413af49f804fe13ff8c9b2b887ef2e357277501d29cf746c057c4c5c85b88 not found: ID does not exist" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.826384 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-mzmd5"] Mar 13 12:09:25 crc kubenswrapper[4837]: E0313 12:09:25.827133 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="proxy-httpd" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.827157 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="proxy-httpd" Mar 13 12:09:25 crc kubenswrapper[4837]: E0313 12:09:25.827284 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="ceilometer-central-agent" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.827299 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="ceilometer-central-agent" Mar 13 12:09:25 crc kubenswrapper[4837]: E0313 12:09:25.827318 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="ceilometer-notification-agent" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.827327 4837 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="ceilometer-notification-agent" Mar 13 12:09:25 crc kubenswrapper[4837]: E0313 12:09:25.827342 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="sg-core" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.827350 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="sg-core" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.827673 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="ceilometer-notification-agent" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.827703 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="proxy-httpd" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.827722 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="ceilometer-central-agent" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.828950 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="sg-core" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.830403 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.833120 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.833361 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.857113 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mzmd5"] Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.915216 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-scripts\") pod \"nova-cell1-cell-mapping-mzmd5\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.915513 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-config-data\") pod \"nova-cell1-cell-mapping-mzmd5\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.915594 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mzmd5\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.915669 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctw8c\" (UniqueName: 
\"kubernetes.io/projected/f0f45aae-caa3-4c50-9059-be42d328cba1-kube-api-access-ctw8c\") pod \"nova-cell1-cell-mapping-mzmd5\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.949564 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.958745 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.986941 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.990339 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.994094 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.995105 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:25.999987 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.013727 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.017882 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctw8c\" (UniqueName: \"kubernetes.io/projected/f0f45aae-caa3-4c50-9059-be42d328cba1-kube-api-access-ctw8c\") pod \"nova-cell1-cell-mapping-mzmd5\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.018015 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-scripts\") pod \"nova-cell1-cell-mapping-mzmd5\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.018174 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-config-data\") pod \"nova-cell1-cell-mapping-mzmd5\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.018271 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mzmd5\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.025773 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-config-data\") pod \"nova-cell1-cell-mapping-mzmd5\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.026725 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mzmd5\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.029816 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-scripts\") pod \"nova-cell1-cell-mapping-mzmd5\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.043036 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctw8c\" (UniqueName: \"kubernetes.io/projected/f0f45aae-caa3-4c50-9059-be42d328cba1-kube-api-access-ctw8c\") pod \"nova-cell1-cell-mapping-mzmd5\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.119988 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.120068 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.120129 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-config-data\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.120173 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdwkd\" (UniqueName: 
\"kubernetes.io/projected/82b5b509-a674-4a89-a7cc-c01c7bfca144-kube-api-access-wdwkd\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.120280 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b5b509-a674-4a89-a7cc-c01c7bfca144-log-httpd\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.120375 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.120408 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b5b509-a674-4a89-a7cc-c01c7bfca144-run-httpd\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.120450 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-scripts\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.173161 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.222803 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.222875 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b5b509-a674-4a89-a7cc-c01c7bfca144-run-httpd\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.222920 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-scripts\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.222947 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.222979 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.223038 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-config-data\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.223072 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdwkd\" (UniqueName: \"kubernetes.io/projected/82b5b509-a674-4a89-a7cc-c01c7bfca144-kube-api-access-wdwkd\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.223146 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b5b509-a674-4a89-a7cc-c01c7bfca144-log-httpd\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.223815 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b5b509-a674-4a89-a7cc-c01c7bfca144-log-httpd\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.224081 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b5b509-a674-4a89-a7cc-c01c7bfca144-run-httpd\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.232483 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: 
I0313 12:09:26.232999 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-scripts\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.233124 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.234090 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-config-data\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.234334 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.247139 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdwkd\" (UniqueName: \"kubernetes.io/projected/82b5b509-a674-4a89-a7cc-c01c7bfca144-kube-api-access-wdwkd\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.311723 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.384007 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.456334 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-5blpv"] Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.457712 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-5blpv" podUID="6de330b6-0bbb-4a9d-9062-9c7ed182a189" containerName="dnsmasq-dns" containerID="cri-o://1d8786b6d9674dc9d5eaebc032e5dbd8c1d018dc4a94605d311592e57b3895fc" gracePeriod=10 Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.738019 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mzmd5"] Mar 13 12:09:26 crc kubenswrapper[4837]: W0313 12:09:26.752277 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0f45aae_caa3_4c50_9059_be42d328cba1.slice/crio-169405adf02892a1cf5eb78b7101a3517b5c16d690364e5a55dd374130c20692 WatchSource:0}: Error finding container 169405adf02892a1cf5eb78b7101a3517b5c16d690364e5a55dd374130c20692: Status 404 returned error can't find the container with id 169405adf02892a1cf5eb78b7101a3517b5c16d690364e5a55dd374130c20692 Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.969096 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.108788 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" path="/var/lib/kubelet/pods/87f0825e-ff58-4bf4-bf83-6522dcc333e2/volumes" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.150168 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.252185 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-dns-swift-storage-0\") pod \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.252318 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-ovsdbserver-nb\") pod \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.252422 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-config\") pod \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.252678 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-ovsdbserver-sb\") pod \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.252759 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-dns-svc\") pod \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.252896 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gvxg\" 
(UniqueName: \"kubernetes.io/projected/6de330b6-0bbb-4a9d-9062-9c7ed182a189-kube-api-access-7gvxg\") pod \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.279766 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6de330b6-0bbb-4a9d-9062-9c7ed182a189-kube-api-access-7gvxg" (OuterVolumeSpecName: "kube-api-access-7gvxg") pod "6de330b6-0bbb-4a9d-9062-9c7ed182a189" (UID: "6de330b6-0bbb-4a9d-9062-9c7ed182a189"). InnerVolumeSpecName "kube-api-access-7gvxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.322334 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6de330b6-0bbb-4a9d-9062-9c7ed182a189" (UID: "6de330b6-0bbb-4a9d-9062-9c7ed182a189"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.332176 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6de330b6-0bbb-4a9d-9062-9c7ed182a189" (UID: "6de330b6-0bbb-4a9d-9062-9c7ed182a189"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.337713 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6de330b6-0bbb-4a9d-9062-9c7ed182a189" (UID: "6de330b6-0bbb-4a9d-9062-9c7ed182a189"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.343365 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6de330b6-0bbb-4a9d-9062-9c7ed182a189" (UID: "6de330b6-0bbb-4a9d-9062-9c7ed182a189"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.343433 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-config" (OuterVolumeSpecName: "config") pod "6de330b6-0bbb-4a9d-9062-9c7ed182a189" (UID: "6de330b6-0bbb-4a9d-9062-9c7ed182a189"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.355759 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.355799 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.355811 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.355826 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:27 
crc kubenswrapper[4837]: I0313 12:09:27.355838 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.355849 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gvxg\" (UniqueName: \"kubernetes.io/projected/6de330b6-0bbb-4a9d-9062-9c7ed182a189-kube-api-access-7gvxg\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.572628 4837 generic.go:334] "Generic (PLEG): container finished" podID="6de330b6-0bbb-4a9d-9062-9c7ed182a189" containerID="1d8786b6d9674dc9d5eaebc032e5dbd8c1d018dc4a94605d311592e57b3895fc" exitCode=0 Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.572727 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-5blpv" event={"ID":"6de330b6-0bbb-4a9d-9062-9c7ed182a189","Type":"ContainerDied","Data":"1d8786b6d9674dc9d5eaebc032e5dbd8c1d018dc4a94605d311592e57b3895fc"} Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.572761 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-5blpv" event={"ID":"6de330b6-0bbb-4a9d-9062-9c7ed182a189","Type":"ContainerDied","Data":"d820b1edec0c5d2d420936bab95dffbf9bd4c7adef7db33d312ced4b311526ff"} Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.572768 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.572781 4837 scope.go:117] "RemoveContainer" containerID="1d8786b6d9674dc9d5eaebc032e5dbd8c1d018dc4a94605d311592e57b3895fc" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.575341 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mzmd5" event={"ID":"f0f45aae-caa3-4c50-9059-be42d328cba1","Type":"ContainerStarted","Data":"e540ca1787fcba1ed1f9804f4336a11c9388c115ed0bc76404d559071e68ab56"} Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.576205 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mzmd5" event={"ID":"f0f45aae-caa3-4c50-9059-be42d328cba1","Type":"ContainerStarted","Data":"169405adf02892a1cf5eb78b7101a3517b5c16d690364e5a55dd374130c20692"} Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.583172 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b5b509-a674-4a89-a7cc-c01c7bfca144","Type":"ContainerStarted","Data":"7d1984717c670e6c61e1d51e4ec9299f3dd2cfc5f7b44bf67bbe7d72918d5c6e"} Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.593685 4837 scope.go:117] "RemoveContainer" containerID="7468313c118293c73f68950b41a915eb07c6510dd5985dea1ec55106483d1ae9" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.621456 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-mzmd5" podStartSLOduration=2.621433569 podStartE2EDuration="2.621433569s" podCreationTimestamp="2026-03-13 12:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:09:27.606471077 +0000 UTC m=+1283.244737850" watchObservedRunningTime="2026-03-13 12:09:27.621433569 +0000 UTC m=+1283.259700342" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.636117 4837 
scope.go:117] "RemoveContainer" containerID="1d8786b6d9674dc9d5eaebc032e5dbd8c1d018dc4a94605d311592e57b3895fc" Mar 13 12:09:27 crc kubenswrapper[4837]: E0313 12:09:27.639832 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d8786b6d9674dc9d5eaebc032e5dbd8c1d018dc4a94605d311592e57b3895fc\": container with ID starting with 1d8786b6d9674dc9d5eaebc032e5dbd8c1d018dc4a94605d311592e57b3895fc not found: ID does not exist" containerID="1d8786b6d9674dc9d5eaebc032e5dbd8c1d018dc4a94605d311592e57b3895fc" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.639920 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d8786b6d9674dc9d5eaebc032e5dbd8c1d018dc4a94605d311592e57b3895fc"} err="failed to get container status \"1d8786b6d9674dc9d5eaebc032e5dbd8c1d018dc4a94605d311592e57b3895fc\": rpc error: code = NotFound desc = could not find container \"1d8786b6d9674dc9d5eaebc032e5dbd8c1d018dc4a94605d311592e57b3895fc\": container with ID starting with 1d8786b6d9674dc9d5eaebc032e5dbd8c1d018dc4a94605d311592e57b3895fc not found: ID does not exist" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.640009 4837 scope.go:117] "RemoveContainer" containerID="7468313c118293c73f68950b41a915eb07c6510dd5985dea1ec55106483d1ae9" Mar 13 12:09:27 crc kubenswrapper[4837]: E0313 12:09:27.640772 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7468313c118293c73f68950b41a915eb07c6510dd5985dea1ec55106483d1ae9\": container with ID starting with 7468313c118293c73f68950b41a915eb07c6510dd5985dea1ec55106483d1ae9 not found: ID does not exist" containerID="7468313c118293c73f68950b41a915eb07c6510dd5985dea1ec55106483d1ae9" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.640840 4837 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7468313c118293c73f68950b41a915eb07c6510dd5985dea1ec55106483d1ae9"} err="failed to get container status \"7468313c118293c73f68950b41a915eb07c6510dd5985dea1ec55106483d1ae9\": rpc error: code = NotFound desc = could not find container \"7468313c118293c73f68950b41a915eb07c6510dd5985dea1ec55106483d1ae9\": container with ID starting with 7468313c118293c73f68950b41a915eb07c6510dd5985dea1ec55106483d1ae9 not found: ID does not exist" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.652658 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-5blpv"] Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.663595 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-5blpv"] Mar 13 12:09:28 crc kubenswrapper[4837]: I0313 12:09:28.596674 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b5b509-a674-4a89-a7cc-c01c7bfca144","Type":"ContainerStarted","Data":"bbfcf746609a946bc3db7fc627e745ae9e9768a0d134e4e98791513fbddaa72d"} Mar 13 12:09:28 crc kubenswrapper[4837]: I0313 12:09:28.597052 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b5b509-a674-4a89-a7cc-c01c7bfca144","Type":"ContainerStarted","Data":"045eef85cb2e6b2e1ec2f3cbd2e8715b1219b97a406f5894bd80115a5a961db4"} Mar 13 12:09:29 crc kubenswrapper[4837]: I0313 12:09:29.063298 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6de330b6-0bbb-4a9d-9062-9c7ed182a189" path="/var/lib/kubelet/pods/6de330b6-0bbb-4a9d-9062-9c7ed182a189/volumes" Mar 13 12:09:29 crc kubenswrapper[4837]: I0313 12:09:29.611073 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b5b509-a674-4a89-a7cc-c01c7bfca144","Type":"ContainerStarted","Data":"9dcde2302697f768971d264413f868731fa4ac41393b876ece3a51c664182c72"} Mar 13 12:09:31 crc kubenswrapper[4837]: I0313 
12:09:31.630663 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b5b509-a674-4a89-a7cc-c01c7bfca144","Type":"ContainerStarted","Data":"2e4fc926acf9cb684f3794b18592198042298af8a9d1176684449adb0020a337"} Mar 13 12:09:31 crc kubenswrapper[4837]: I0313 12:09:31.631185 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 12:09:31 crc kubenswrapper[4837]: I0313 12:09:31.659963 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.424966887 podStartE2EDuration="6.659937463s" podCreationTimestamp="2026-03-13 12:09:25 +0000 UTC" firstStartedPulling="2026-03-13 12:09:27.059493605 +0000 UTC m=+1282.697760378" lastFinishedPulling="2026-03-13 12:09:30.294464191 +0000 UTC m=+1285.932730954" observedRunningTime="2026-03-13 12:09:31.651287921 +0000 UTC m=+1287.289554694" watchObservedRunningTime="2026-03-13 12:09:31.659937463 +0000 UTC m=+1287.298204226" Mar 13 12:09:32 crc kubenswrapper[4837]: I0313 12:09:32.640884 4837 generic.go:334] "Generic (PLEG): container finished" podID="f0f45aae-caa3-4c50-9059-be42d328cba1" containerID="e540ca1787fcba1ed1f9804f4336a11c9388c115ed0bc76404d559071e68ab56" exitCode=0 Mar 13 12:09:32 crc kubenswrapper[4837]: I0313 12:09:32.640953 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mzmd5" event={"ID":"f0f45aae-caa3-4c50-9059-be42d328cba1","Type":"ContainerDied","Data":"e540ca1787fcba1ed1f9804f4336a11c9388c115ed0bc76404d559071e68ab56"} Mar 13 12:09:32 crc kubenswrapper[4837]: I0313 12:09:32.963394 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 12:09:32 crc kubenswrapper[4837]: I0313 12:09:32.963447 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 12:09:33 crc kubenswrapper[4837]: I0313 12:09:33.986020 4837 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="78c9774b-e6ed-434a-9a05-77de64d14c5c" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 12:09:33 crc kubenswrapper[4837]: I0313 12:09:33.986037 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="78c9774b-e6ed-434a-9a05-77de64d14c5c" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.102774 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.202876 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-combined-ca-bundle\") pod \"f0f45aae-caa3-4c50-9059-be42d328cba1\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.203022 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-config-data\") pod \"f0f45aae-caa3-4c50-9059-be42d328cba1\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.203076 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctw8c\" (UniqueName: \"kubernetes.io/projected/f0f45aae-caa3-4c50-9059-be42d328cba1-kube-api-access-ctw8c\") pod \"f0f45aae-caa3-4c50-9059-be42d328cba1\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.203224 4837 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-scripts\") pod \"f0f45aae-caa3-4c50-9059-be42d328cba1\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.223231 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0f45aae-caa3-4c50-9059-be42d328cba1-kube-api-access-ctw8c" (OuterVolumeSpecName: "kube-api-access-ctw8c") pod "f0f45aae-caa3-4c50-9059-be42d328cba1" (UID: "f0f45aae-caa3-4c50-9059-be42d328cba1"). InnerVolumeSpecName "kube-api-access-ctw8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.224385 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-scripts" (OuterVolumeSpecName: "scripts") pod "f0f45aae-caa3-4c50-9059-be42d328cba1" (UID: "f0f45aae-caa3-4c50-9059-be42d328cba1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.237818 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0f45aae-caa3-4c50-9059-be42d328cba1" (UID: "f0f45aae-caa3-4c50-9059-be42d328cba1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.243873 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-config-data" (OuterVolumeSpecName: "config-data") pod "f0f45aae-caa3-4c50-9059-be42d328cba1" (UID: "f0f45aae-caa3-4c50-9059-be42d328cba1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.308002 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.308290 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.308359 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctw8c\" (UniqueName: \"kubernetes.io/projected/f0f45aae-caa3-4c50-9059-be42d328cba1-kube-api-access-ctw8c\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.308417 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.664472 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mzmd5" event={"ID":"f0f45aae-caa3-4c50-9059-be42d328cba1","Type":"ContainerDied","Data":"169405adf02892a1cf5eb78b7101a3517b5c16d690364e5a55dd374130c20692"} Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.664733 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="169405adf02892a1cf5eb78b7101a3517b5c16d690364e5a55dd374130c20692" Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.664884 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.860832 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.861093 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4cc7473d-2608-4989-990f-a19d70e8a3a3" containerName="nova-scheduler-scheduler" containerID="cri-o://84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7" gracePeriod=30 Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.883346 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.883667 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="78c9774b-e6ed-434a-9a05-77de64d14c5c" containerName="nova-api-log" containerID="cri-o://7b6a520617c17e8a5aea39cc8ad6ac82a12e5d413248ffe3dd096388d1de71a6" gracePeriod=30 Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.884154 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="78c9774b-e6ed-434a-9a05-77de64d14c5c" containerName="nova-api-api" containerID="cri-o://4ca07acd2a41f0c3f14868adbd5c92b4f4e135a23bfc2d9caf191ce9bdb4d94a" gracePeriod=30 Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.899354 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.899859 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" containerName="nova-metadata-log" containerID="cri-o://9b97f0741ed8dc4568de5acf76058ec03e048925850e843f27b09f36ed5bcf98" gracePeriod=30 Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.900077 4837 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" containerName="nova-metadata-metadata" containerID="cri-o://456c80f0855c2245bf0a0fc6d9cc652dad19c0773d477ea76ccfc7415d5a5c8e" gracePeriod=30 Mar 13 12:09:35 crc kubenswrapper[4837]: E0313 12:09:35.633166 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 12:09:35 crc kubenswrapper[4837]: E0313 12:09:35.636116 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 12:09:35 crc kubenswrapper[4837]: E0313 12:09:35.637410 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 12:09:35 crc kubenswrapper[4837]: E0313 12:09:35.637451 4837 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="4cc7473d-2608-4989-990f-a19d70e8a3a3" containerName="nova-scheduler-scheduler" Mar 13 12:09:35 crc kubenswrapper[4837]: I0313 12:09:35.674866 4837 generic.go:334] "Generic (PLEG): container finished" 
podID="5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" containerID="9b97f0741ed8dc4568de5acf76058ec03e048925850e843f27b09f36ed5bcf98" exitCode=143 Mar 13 12:09:35 crc kubenswrapper[4837]: I0313 12:09:35.674951 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43","Type":"ContainerDied","Data":"9b97f0741ed8dc4568de5acf76058ec03e048925850e843f27b09f36ed5bcf98"} Mar 13 12:09:35 crc kubenswrapper[4837]: I0313 12:09:35.676980 4837 generic.go:334] "Generic (PLEG): container finished" podID="78c9774b-e6ed-434a-9a05-77de64d14c5c" containerID="7b6a520617c17e8a5aea39cc8ad6ac82a12e5d413248ffe3dd096388d1de71a6" exitCode=143 Mar 13 12:09:35 crc kubenswrapper[4837]: I0313 12:09:35.677038 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78c9774b-e6ed-434a-9a05-77de64d14c5c","Type":"ContainerDied","Data":"7b6a520617c17e8a5aea39cc8ad6ac82a12e5d413248ffe3dd096388d1de71a6"} Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.505726 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.589883 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-config-data\") pod \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.590053 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-combined-ca-bundle\") pod \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.590117 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7skk9\" (UniqueName: \"kubernetes.io/projected/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-kube-api-access-7skk9\") pod \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.590816 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-nova-metadata-tls-certs\") pod \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.591177 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-logs\") pod \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.591880 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-logs" (OuterVolumeSpecName: "logs") pod "5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" (UID: "5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.598267 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-kube-api-access-7skk9" (OuterVolumeSpecName: "kube-api-access-7skk9") pod "5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" (UID: "5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43"). InnerVolumeSpecName "kube-api-access-7skk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.619946 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-config-data" (OuterVolumeSpecName: "config-data") pod "5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" (UID: "5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.621882 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" (UID: "5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.640090 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" (UID: "5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.692987 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.693022 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7skk9\" (UniqueName: \"kubernetes.io/projected/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-kube-api-access-7skk9\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.693033 4837 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.693044 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.693053 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.704281 4837 generic.go:334] "Generic (PLEG): container finished" podID="5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" containerID="456c80f0855c2245bf0a0fc6d9cc652dad19c0773d477ea76ccfc7415d5a5c8e" exitCode=0 Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.704335 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43","Type":"ContainerDied","Data":"456c80f0855c2245bf0a0fc6d9cc652dad19c0773d477ea76ccfc7415d5a5c8e"} 
Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.704368 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43","Type":"ContainerDied","Data":"c2ff21ee05eb4c0cd65e5feb281a54f68d478fb493c97488ec3ad06bbc0f4880"} Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.704391 4837 scope.go:117] "RemoveContainer" containerID="456c80f0855c2245bf0a0fc6d9cc652dad19c0773d477ea76ccfc7415d5a5c8e" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.704531 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.739427 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.740718 4837 scope.go:117] "RemoveContainer" containerID="9b97f0741ed8dc4568de5acf76058ec03e048925850e843f27b09f36ed5bcf98" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.751112 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.759306 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:09:38 crc kubenswrapper[4837]: E0313 12:09:38.759790 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6de330b6-0bbb-4a9d-9062-9c7ed182a189" containerName="dnsmasq-dns" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.759805 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6de330b6-0bbb-4a9d-9062-9c7ed182a189" containerName="dnsmasq-dns" Mar 13 12:09:38 crc kubenswrapper[4837]: E0313 12:09:38.759828 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" containerName="nova-metadata-metadata" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.759834 4837 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" containerName="nova-metadata-metadata" Mar 13 12:09:38 crc kubenswrapper[4837]: E0313 12:09:38.759844 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6de330b6-0bbb-4a9d-9062-9c7ed182a189" containerName="init" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.759850 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6de330b6-0bbb-4a9d-9062-9c7ed182a189" containerName="init" Mar 13 12:09:38 crc kubenswrapper[4837]: E0313 12:09:38.759863 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0f45aae-caa3-4c50-9059-be42d328cba1" containerName="nova-manage" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.759869 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0f45aae-caa3-4c50-9059-be42d328cba1" containerName="nova-manage" Mar 13 12:09:38 crc kubenswrapper[4837]: E0313 12:09:38.759897 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" containerName="nova-metadata-log" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.759903 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" containerName="nova-metadata-log" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.760070 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" containerName="nova-metadata-metadata" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.760080 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" containerName="nova-metadata-log" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.760088 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0f45aae-caa3-4c50-9059-be42d328cba1" containerName="nova-manage" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.760111 4837 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6de330b6-0bbb-4a9d-9062-9c7ed182a189" containerName="dnsmasq-dns" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.761125 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.768571 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.781980 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.790706 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.791272 4837 scope.go:117] "RemoveContainer" containerID="456c80f0855c2245bf0a0fc6d9cc652dad19c0773d477ea76ccfc7415d5a5c8e" Mar 13 12:09:38 crc kubenswrapper[4837]: E0313 12:09:38.800706 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"456c80f0855c2245bf0a0fc6d9cc652dad19c0773d477ea76ccfc7415d5a5c8e\": container with ID starting with 456c80f0855c2245bf0a0fc6d9cc652dad19c0773d477ea76ccfc7415d5a5c8e not found: ID does not exist" containerID="456c80f0855c2245bf0a0fc6d9cc652dad19c0773d477ea76ccfc7415d5a5c8e" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.800866 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"456c80f0855c2245bf0a0fc6d9cc652dad19c0773d477ea76ccfc7415d5a5c8e"} err="failed to get container status \"456c80f0855c2245bf0a0fc6d9cc652dad19c0773d477ea76ccfc7415d5a5c8e\": rpc error: code = NotFound desc = could not find container \"456c80f0855c2245bf0a0fc6d9cc652dad19c0773d477ea76ccfc7415d5a5c8e\": container with ID starting with 456c80f0855c2245bf0a0fc6d9cc652dad19c0773d477ea76ccfc7415d5a5c8e not found: ID does not exist" Mar 13 12:09:38 crc 
kubenswrapper[4837]: I0313 12:09:38.800901 4837 scope.go:117] "RemoveContainer" containerID="9b97f0741ed8dc4568de5acf76058ec03e048925850e843f27b09f36ed5bcf98" Mar 13 12:09:38 crc kubenswrapper[4837]: E0313 12:09:38.801505 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b97f0741ed8dc4568de5acf76058ec03e048925850e843f27b09f36ed5bcf98\": container with ID starting with 9b97f0741ed8dc4568de5acf76058ec03e048925850e843f27b09f36ed5bcf98 not found: ID does not exist" containerID="9b97f0741ed8dc4568de5acf76058ec03e048925850e843f27b09f36ed5bcf98" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.801621 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b97f0741ed8dc4568de5acf76058ec03e048925850e843f27b09f36ed5bcf98"} err="failed to get container status \"9b97f0741ed8dc4568de5acf76058ec03e048925850e843f27b09f36ed5bcf98\": rpc error: code = NotFound desc = could not find container \"9b97f0741ed8dc4568de5acf76058ec03e048925850e843f27b09f36ed5bcf98\": container with ID starting with 9b97f0741ed8dc4568de5acf76058ec03e048925850e843f27b09f36ed5bcf98 not found: ID does not exist" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.914814 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7faa5418-aa48-4e20-830c-bb171cfea0d9-logs\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.914909 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7faa5418-aa48-4e20-830c-bb171cfea0d9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:38 crc 
kubenswrapper[4837]: I0313 12:09:38.915018 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7faa5418-aa48-4e20-830c-bb171cfea0d9-config-data\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.915072 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rwv5\" (UniqueName: \"kubernetes.io/projected/7faa5418-aa48-4e20-830c-bb171cfea0d9-kube-api-access-9rwv5\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.915136 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7faa5418-aa48-4e20-830c-bb171cfea0d9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.016756 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7faa5418-aa48-4e20-830c-bb171cfea0d9-config-data\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.016843 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rwv5\" (UniqueName: \"kubernetes.io/projected/7faa5418-aa48-4e20-830c-bb171cfea0d9-kube-api-access-9rwv5\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.016907 4837 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7faa5418-aa48-4e20-830c-bb171cfea0d9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.016952 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7faa5418-aa48-4e20-830c-bb171cfea0d9-logs\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.017028 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7faa5418-aa48-4e20-830c-bb171cfea0d9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.017441 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7faa5418-aa48-4e20-830c-bb171cfea0d9-logs\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.020680 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7faa5418-aa48-4e20-830c-bb171cfea0d9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.020731 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7faa5418-aa48-4e20-830c-bb171cfea0d9-config-data\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" 
Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.021311 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7faa5418-aa48-4e20-830c-bb171cfea0d9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.040325 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rwv5\" (UniqueName: \"kubernetes.io/projected/7faa5418-aa48-4e20-830c-bb171cfea0d9-kube-api-access-9rwv5\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.063047 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" path="/var/lib/kubelet/pods/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43/volumes" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.090768 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:09:39 crc kubenswrapper[4837]: W0313 12:09:39.548023 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7faa5418_aa48_4e20_830c_bb171cfea0d9.slice/crio-b1cb476eb0c804299feb94db952778382ba43003eedec8e33c2f2553f024b174 WatchSource:0}: Error finding container b1cb476eb0c804299feb94db952778382ba43003eedec8e33c2f2553f024b174: Status 404 returned error can't find the container with id b1cb476eb0c804299feb94db952778382ba43003eedec8e33c2f2553f024b174 Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.559224 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.671393 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.719007 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.742717 4837 generic.go:334] "Generic (PLEG): container finished" podID="78c9774b-e6ed-434a-9a05-77de64d14c5c" containerID="4ca07acd2a41f0c3f14868adbd5c92b4f4e135a23bfc2d9caf191ce9bdb4d94a" exitCode=0 Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.742791 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78c9774b-e6ed-434a-9a05-77de64d14c5c","Type":"ContainerDied","Data":"4ca07acd2a41f0c3f14868adbd5c92b4f4e135a23bfc2d9caf191ce9bdb4d94a"} Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.742820 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78c9774b-e6ed-434a-9a05-77de64d14c5c","Type":"ContainerDied","Data":"196c93139204cac88ac74bf775fe6446d52e56b24f2532d6b6a8393e6ffd7da4"} Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.742836 4837 scope.go:117] "RemoveContainer" containerID="4ca07acd2a41f0c3f14868adbd5c92b4f4e135a23bfc2d9caf191ce9bdb4d94a" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.742832 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.743939 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cc7473d-2608-4989-990f-a19d70e8a3a3-combined-ca-bundle\") pod \"4cc7473d-2608-4989-990f-a19d70e8a3a3\" (UID: \"4cc7473d-2608-4989-990f-a19d70e8a3a3\") " Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.744135 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cc7473d-2608-4989-990f-a19d70e8a3a3-config-data\") pod \"4cc7473d-2608-4989-990f-a19d70e8a3a3\" (UID: \"4cc7473d-2608-4989-990f-a19d70e8a3a3\") " Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.744231 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xqdx\" (UniqueName: \"kubernetes.io/projected/4cc7473d-2608-4989-990f-a19d70e8a3a3-kube-api-access-7xqdx\") pod \"4cc7473d-2608-4989-990f-a19d70e8a3a3\" (UID: \"4cc7473d-2608-4989-990f-a19d70e8a3a3\") " Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.749478 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7faa5418-aa48-4e20-830c-bb171cfea0d9","Type":"ContainerStarted","Data":"b1cb476eb0c804299feb94db952778382ba43003eedec8e33c2f2553f024b174"} Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.755137 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cc7473d-2608-4989-990f-a19d70e8a3a3-kube-api-access-7xqdx" (OuterVolumeSpecName: "kube-api-access-7xqdx") pod "4cc7473d-2608-4989-990f-a19d70e8a3a3" (UID: "4cc7473d-2608-4989-990f-a19d70e8a3a3"). InnerVolumeSpecName "kube-api-access-7xqdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.755533 4837 generic.go:334] "Generic (PLEG): container finished" podID="4cc7473d-2608-4989-990f-a19d70e8a3a3" containerID="84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7" exitCode=0 Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.755572 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4cc7473d-2608-4989-990f-a19d70e8a3a3","Type":"ContainerDied","Data":"84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7"} Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.755583 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.755598 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4cc7473d-2608-4989-990f-a19d70e8a3a3","Type":"ContainerDied","Data":"7499f46006750dbda5f6e6ebf4a383aa5e9cd8afe4efcdd2e54c6f88540d0454"} Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.770906 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cc7473d-2608-4989-990f-a19d70e8a3a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4cc7473d-2608-4989-990f-a19d70e8a3a3" (UID: "4cc7473d-2608-4989-990f-a19d70e8a3a3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.773537 4837 scope.go:117] "RemoveContainer" containerID="7b6a520617c17e8a5aea39cc8ad6ac82a12e5d413248ffe3dd096388d1de71a6" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.778316 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cc7473d-2608-4989-990f-a19d70e8a3a3-config-data" (OuterVolumeSpecName: "config-data") pod "4cc7473d-2608-4989-990f-a19d70e8a3a3" (UID: "4cc7473d-2608-4989-990f-a19d70e8a3a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.791965 4837 scope.go:117] "RemoveContainer" containerID="4ca07acd2a41f0c3f14868adbd5c92b4f4e135a23bfc2d9caf191ce9bdb4d94a" Mar 13 12:09:39 crc kubenswrapper[4837]: E0313 12:09:39.792746 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ca07acd2a41f0c3f14868adbd5c92b4f4e135a23bfc2d9caf191ce9bdb4d94a\": container with ID starting with 4ca07acd2a41f0c3f14868adbd5c92b4f4e135a23bfc2d9caf191ce9bdb4d94a not found: ID does not exist" containerID="4ca07acd2a41f0c3f14868adbd5c92b4f4e135a23bfc2d9caf191ce9bdb4d94a" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.792791 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ca07acd2a41f0c3f14868adbd5c92b4f4e135a23bfc2d9caf191ce9bdb4d94a"} err="failed to get container status \"4ca07acd2a41f0c3f14868adbd5c92b4f4e135a23bfc2d9caf191ce9bdb4d94a\": rpc error: code = NotFound desc = could not find container \"4ca07acd2a41f0c3f14868adbd5c92b4f4e135a23bfc2d9caf191ce9bdb4d94a\": container with ID starting with 4ca07acd2a41f0c3f14868adbd5c92b4f4e135a23bfc2d9caf191ce9bdb4d94a not found: ID does not exist" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.792824 4837 scope.go:117] "RemoveContainer" 
containerID="7b6a520617c17e8a5aea39cc8ad6ac82a12e5d413248ffe3dd096388d1de71a6" Mar 13 12:09:39 crc kubenswrapper[4837]: E0313 12:09:39.793222 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b6a520617c17e8a5aea39cc8ad6ac82a12e5d413248ffe3dd096388d1de71a6\": container with ID starting with 7b6a520617c17e8a5aea39cc8ad6ac82a12e5d413248ffe3dd096388d1de71a6 not found: ID does not exist" containerID="7b6a520617c17e8a5aea39cc8ad6ac82a12e5d413248ffe3dd096388d1de71a6" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.793434 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b6a520617c17e8a5aea39cc8ad6ac82a12e5d413248ffe3dd096388d1de71a6"} err="failed to get container status \"7b6a520617c17e8a5aea39cc8ad6ac82a12e5d413248ffe3dd096388d1de71a6\": rpc error: code = NotFound desc = could not find container \"7b6a520617c17e8a5aea39cc8ad6ac82a12e5d413248ffe3dd096388d1de71a6\": container with ID starting with 7b6a520617c17e8a5aea39cc8ad6ac82a12e5d413248ffe3dd096388d1de71a6 not found: ID does not exist" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.793584 4837 scope.go:117] "RemoveContainer" containerID="84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.821030 4837 scope.go:117] "RemoveContainer" containerID="84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7" Mar 13 12:09:39 crc kubenswrapper[4837]: E0313 12:09:39.821520 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7\": container with ID starting with 84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7 not found: ID does not exist" containerID="84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7" Mar 13 12:09:39 crc 
kubenswrapper[4837]: I0313 12:09:39.821653 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7"} err="failed to get container status \"84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7\": rpc error: code = NotFound desc = could not find container \"84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7\": container with ID starting with 84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7 not found: ID does not exist" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.846099 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52nqk\" (UniqueName: \"kubernetes.io/projected/78c9774b-e6ed-434a-9a05-77de64d14c5c-kube-api-access-52nqk\") pod \"78c9774b-e6ed-434a-9a05-77de64d14c5c\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.846259 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-public-tls-certs\") pod \"78c9774b-e6ed-434a-9a05-77de64d14c5c\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.846331 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78c9774b-e6ed-434a-9a05-77de64d14c5c-logs\") pod \"78c9774b-e6ed-434a-9a05-77de64d14c5c\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.846536 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-config-data\") pod \"78c9774b-e6ed-434a-9a05-77de64d14c5c\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " Mar 13 12:09:39 crc 
kubenswrapper[4837]: I0313 12:09:39.846693 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-combined-ca-bundle\") pod \"78c9774b-e6ed-434a-9a05-77de64d14c5c\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.846908 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78c9774b-e6ed-434a-9a05-77de64d14c5c-logs" (OuterVolumeSpecName: "logs") pod "78c9774b-e6ed-434a-9a05-77de64d14c5c" (UID: "78c9774b-e6ed-434a-9a05-77de64d14c5c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.847027 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-internal-tls-certs\") pod \"78c9774b-e6ed-434a-9a05-77de64d14c5c\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.847549 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cc7473d-2608-4989-990f-a19d70e8a3a3-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.847630 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xqdx\" (UniqueName: \"kubernetes.io/projected/4cc7473d-2608-4989-990f-a19d70e8a3a3-kube-api-access-7xqdx\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.847764 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cc7473d-2608-4989-990f-a19d70e8a3a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.851062 4837 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78c9774b-e6ed-434a-9a05-77de64d14c5c-kube-api-access-52nqk" (OuterVolumeSpecName: "kube-api-access-52nqk") pod "78c9774b-e6ed-434a-9a05-77de64d14c5c" (UID: "78c9774b-e6ed-434a-9a05-77de64d14c5c"). InnerVolumeSpecName "kube-api-access-52nqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.881481 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-config-data" (OuterVolumeSpecName: "config-data") pod "78c9774b-e6ed-434a-9a05-77de64d14c5c" (UID: "78c9774b-e6ed-434a-9a05-77de64d14c5c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.886473 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78c9774b-e6ed-434a-9a05-77de64d14c5c" (UID: "78c9774b-e6ed-434a-9a05-77de64d14c5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.909920 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "78c9774b-e6ed-434a-9a05-77de64d14c5c" (UID: "78c9774b-e6ed-434a-9a05-77de64d14c5c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.915835 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "78c9774b-e6ed-434a-9a05-77de64d14c5c" (UID: "78c9774b-e6ed-434a-9a05-77de64d14c5c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.949788 4837 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.949864 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78c9774b-e6ed-434a-9a05-77de64d14c5c-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.949877 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.949918 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.949931 4837 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.949942 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52nqk\" (UniqueName: 
\"kubernetes.io/projected/78c9774b-e6ed-434a-9a05-77de64d14c5c-kube-api-access-52nqk\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.078523 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.087390 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.100574 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.109384 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.124449 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 12:09:40 crc kubenswrapper[4837]: E0313 12:09:40.124943 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cc7473d-2608-4989-990f-a19d70e8a3a3" containerName="nova-scheduler-scheduler" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.124967 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cc7473d-2608-4989-990f-a19d70e8a3a3" containerName="nova-scheduler-scheduler" Mar 13 12:09:40 crc kubenswrapper[4837]: E0313 12:09:40.124992 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78c9774b-e6ed-434a-9a05-77de64d14c5c" containerName="nova-api-api" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.125001 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="78c9774b-e6ed-434a-9a05-77de64d14c5c" containerName="nova-api-api" Mar 13 12:09:40 crc kubenswrapper[4837]: E0313 12:09:40.125019 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78c9774b-e6ed-434a-9a05-77de64d14c5c" containerName="nova-api-log" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.125028 4837 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="78c9774b-e6ed-434a-9a05-77de64d14c5c" containerName="nova-api-log" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.125243 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cc7473d-2608-4989-990f-a19d70e8a3a3" containerName="nova-scheduler-scheduler" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.125269 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="78c9774b-e6ed-434a-9a05-77de64d14c5c" containerName="nova-api-api" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.125285 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="78c9774b-e6ed-434a-9a05-77de64d14c5c" containerName="nova-api-log" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.126443 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.140499 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.140773 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.140919 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.141814 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.144817 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.145900 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.149761 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.170328 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.253974 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e6cd1d9-f670-4e94-8322-44e471c3be71-logs\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.254257 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e6cd1d9-f670-4e94-8322-44e471c3be71-public-tls-certs\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.254319 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d380e047-7297-4835-b948-6c86c6b6aa27-config-data\") pod \"nova-scheduler-0\" (UID: \"d380e047-7297-4835-b948-6c86c6b6aa27\") " pod="openstack/nova-scheduler-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.254350 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e6cd1d9-f670-4e94-8322-44e471c3be71-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.254621 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs59c\" (UniqueName: \"kubernetes.io/projected/4e6cd1d9-f670-4e94-8322-44e471c3be71-kube-api-access-gs59c\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.254775 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d380e047-7297-4835-b948-6c86c6b6aa27-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d380e047-7297-4835-b948-6c86c6b6aa27\") " pod="openstack/nova-scheduler-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.254821 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6pv7\" (UniqueName: \"kubernetes.io/projected/d380e047-7297-4835-b948-6c86c6b6aa27-kube-api-access-t6pv7\") pod \"nova-scheduler-0\" (UID: \"d380e047-7297-4835-b948-6c86c6b6aa27\") " pod="openstack/nova-scheduler-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.254908 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e6cd1d9-f670-4e94-8322-44e471c3be71-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.254950 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e6cd1d9-f670-4e94-8322-44e471c3be71-config-data\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.356228 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs59c\" 
(UniqueName: \"kubernetes.io/projected/4e6cd1d9-f670-4e94-8322-44e471c3be71-kube-api-access-gs59c\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.356323 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d380e047-7297-4835-b948-6c86c6b6aa27-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d380e047-7297-4835-b948-6c86c6b6aa27\") " pod="openstack/nova-scheduler-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.356396 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6pv7\" (UniqueName: \"kubernetes.io/projected/d380e047-7297-4835-b948-6c86c6b6aa27-kube-api-access-t6pv7\") pod \"nova-scheduler-0\" (UID: \"d380e047-7297-4835-b948-6c86c6b6aa27\") " pod="openstack/nova-scheduler-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.356442 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e6cd1d9-f670-4e94-8322-44e471c3be71-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.356473 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e6cd1d9-f670-4e94-8322-44e471c3be71-config-data\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.356499 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e6cd1d9-f670-4e94-8322-44e471c3be71-logs\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 
crc kubenswrapper[4837]: I0313 12:09:40.356551 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e6cd1d9-f670-4e94-8322-44e471c3be71-public-tls-certs\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.356596 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d380e047-7297-4835-b948-6c86c6b6aa27-config-data\") pod \"nova-scheduler-0\" (UID: \"d380e047-7297-4835-b948-6c86c6b6aa27\") " pod="openstack/nova-scheduler-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.356652 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e6cd1d9-f670-4e94-8322-44e471c3be71-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.357169 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e6cd1d9-f670-4e94-8322-44e471c3be71-logs\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.361325 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e6cd1d9-f670-4e94-8322-44e471c3be71-public-tls-certs\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.361527 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e6cd1d9-f670-4e94-8322-44e471c3be71-combined-ca-bundle\") pod \"nova-api-0\" 
(UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.362003 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e6cd1d9-f670-4e94-8322-44e471c3be71-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.362558 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d380e047-7297-4835-b948-6c86c6b6aa27-config-data\") pod \"nova-scheduler-0\" (UID: \"d380e047-7297-4835-b948-6c86c6b6aa27\") " pod="openstack/nova-scheduler-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.362657 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e6cd1d9-f670-4e94-8322-44e471c3be71-config-data\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.363026 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d380e047-7297-4835-b948-6c86c6b6aa27-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d380e047-7297-4835-b948-6c86c6b6aa27\") " pod="openstack/nova-scheduler-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.378432 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs59c\" (UniqueName: \"kubernetes.io/projected/4e6cd1d9-f670-4e94-8322-44e471c3be71-kube-api-access-gs59c\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.384891 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6pv7\" 
(UniqueName: \"kubernetes.io/projected/d380e047-7297-4835-b948-6c86c6b6aa27-kube-api-access-t6pv7\") pod \"nova-scheduler-0\" (UID: \"d380e047-7297-4835-b948-6c86c6b6aa27\") " pod="openstack/nova-scheduler-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.518334 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.527046 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.769982 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7faa5418-aa48-4e20-830c-bb171cfea0d9","Type":"ContainerStarted","Data":"47b96a2a9e2d4fd021ca3db7be839e86e74c808de1dd61390ea0329c3aa36dbb"} Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.770190 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7faa5418-aa48-4e20-830c-bb171cfea0d9","Type":"ContainerStarted","Data":"f8feaf729480571f3bbdf2223f87e36947b32528c83503013718e8240ce9b19e"} Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.797786 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.797767274 podStartE2EDuration="2.797767274s" podCreationTimestamp="2026-03-13 12:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:09:40.789318558 +0000 UTC m=+1296.427585311" watchObservedRunningTime="2026-03-13 12:09:40.797767274 +0000 UTC m=+1296.436034037" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.962591 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.976359 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-api-0"] Mar 13 12:09:41 crc kubenswrapper[4837]: I0313 12:09:41.060773 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cc7473d-2608-4989-990f-a19d70e8a3a3" path="/var/lib/kubelet/pods/4cc7473d-2608-4989-990f-a19d70e8a3a3/volumes" Mar 13 12:09:41 crc kubenswrapper[4837]: I0313 12:09:41.061967 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78c9774b-e6ed-434a-9a05-77de64d14c5c" path="/var/lib/kubelet/pods/78c9774b-e6ed-434a-9a05-77de64d14c5c/volumes" Mar 13 12:09:41 crc kubenswrapper[4837]: I0313 12:09:41.780151 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d380e047-7297-4835-b948-6c86c6b6aa27","Type":"ContainerStarted","Data":"1ab5e7d45319b1507059f32e31610ecdcd6883c277c5700b64696c664cfd5b58"} Mar 13 12:09:41 crc kubenswrapper[4837]: I0313 12:09:41.780512 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d380e047-7297-4835-b948-6c86c6b6aa27","Type":"ContainerStarted","Data":"275a8f630bfdc653a0077fb2b60275b114cfeb9c7c779f255ab8db2f4c7baf1d"} Mar 13 12:09:41 crc kubenswrapper[4837]: I0313 12:09:41.782988 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4e6cd1d9-f670-4e94-8322-44e471c3be71","Type":"ContainerStarted","Data":"cc7a15dd902b97c758cb1f742fe8affed99ef0bc2035dc9fea8d357f72b9a616"} Mar 13 12:09:41 crc kubenswrapper[4837]: I0313 12:09:41.783069 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4e6cd1d9-f670-4e94-8322-44e471c3be71","Type":"ContainerStarted","Data":"53969cca8ee0ed37fdb444b1b9c5bf33145fca56378ff3e37a209bebb610a563"} Mar 13 12:09:41 crc kubenswrapper[4837]: I0313 12:09:41.783089 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"4e6cd1d9-f670-4e94-8322-44e471c3be71","Type":"ContainerStarted","Data":"302f8203084fe3c1a235a2a972eeb6c0b3fc658aafcc9322a703a34aebd27b45"} Mar 13 12:09:41 crc kubenswrapper[4837]: I0313 12:09:41.808011 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.807989861 podStartE2EDuration="1.807989861s" podCreationTimestamp="2026-03-13 12:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:09:41.801087634 +0000 UTC m=+1297.439354437" watchObservedRunningTime="2026-03-13 12:09:41.807989861 +0000 UTC m=+1297.446256644" Mar 13 12:09:41 crc kubenswrapper[4837]: I0313 12:09:41.839652 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.839603817 podStartE2EDuration="1.839603817s" podCreationTimestamp="2026-03-13 12:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:09:41.835342152 +0000 UTC m=+1297.473608965" watchObservedRunningTime="2026-03-13 12:09:41.839603817 +0000 UTC m=+1297.477870590" Mar 13 12:09:44 crc kubenswrapper[4837]: I0313 12:09:44.091910 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 12:09:44 crc kubenswrapper[4837]: I0313 12:09:44.092186 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 12:09:45 crc kubenswrapper[4837]: I0313 12:09:45.527679 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 12:09:49 crc kubenswrapper[4837]: I0313 12:09:49.091417 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 12:09:49 crc kubenswrapper[4837]: I0313 12:09:49.091756 4837 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 12:09:50 crc kubenswrapper[4837]: I0313 12:09:50.098903 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7faa5418-aa48-4e20-830c-bb171cfea0d9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 12:09:50 crc kubenswrapper[4837]: I0313 12:09:50.104846 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7faa5418-aa48-4e20-830c-bb171cfea0d9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 12:09:50 crc kubenswrapper[4837]: I0313 12:09:50.519341 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 12:09:50 crc kubenswrapper[4837]: I0313 12:09:50.519450 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 12:09:50 crc kubenswrapper[4837]: I0313 12:09:50.527871 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 12:09:50 crc kubenswrapper[4837]: I0313 12:09:50.557808 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 13 12:09:50 crc kubenswrapper[4837]: I0313 12:09:50.912650 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 13 12:09:51 crc kubenswrapper[4837]: I0313 12:09:51.536857 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4e6cd1d9-f670-4e94-8322-44e471c3be71" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.213:8774/\": net/http: 
request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 12:09:51 crc kubenswrapper[4837]: I0313 12:09:51.537250 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4e6cd1d9-f670-4e94-8322-44e471c3be71" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.213:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 12:09:56 crc kubenswrapper[4837]: I0313 12:09:56.323054 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 13 12:09:59 crc kubenswrapper[4837]: I0313 12:09:59.097134 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 12:09:59 crc kubenswrapper[4837]: I0313 12:09:59.098958 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 12:09:59 crc kubenswrapper[4837]: I0313 12:09:59.103472 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 12:09:59 crc kubenswrapper[4837]: I0313 12:09:59.990409 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.152154 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556730-jvprz"] Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.153758 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556730-jvprz" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.155781 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.156087 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.156096 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.164379 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556730-jvprz"] Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.243776 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2j9n\" (UniqueName: \"kubernetes.io/projected/348878ea-aa9f-4306-af10-6a56583447a4-kube-api-access-z2j9n\") pod \"auto-csr-approver-29556730-jvprz\" (UID: \"348878ea-aa9f-4306-af10-6a56583447a4\") " pod="openshift-infra/auto-csr-approver-29556730-jvprz" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.345709 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2j9n\" (UniqueName: \"kubernetes.io/projected/348878ea-aa9f-4306-af10-6a56583447a4-kube-api-access-z2j9n\") pod \"auto-csr-approver-29556730-jvprz\" (UID: \"348878ea-aa9f-4306-af10-6a56583447a4\") " pod="openshift-infra/auto-csr-approver-29556730-jvprz" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.365918 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2j9n\" (UniqueName: \"kubernetes.io/projected/348878ea-aa9f-4306-af10-6a56583447a4-kube-api-access-z2j9n\") pod \"auto-csr-approver-29556730-jvprz\" (UID: \"348878ea-aa9f-4306-af10-6a56583447a4\") " 
pod="openshift-infra/auto-csr-approver-29556730-jvprz" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.482874 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556730-jvprz" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.536045 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.536815 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.538398 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.581344 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.937333 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556730-jvprz"] Mar 13 12:10:00 crc kubenswrapper[4837]: W0313 12:10:00.942948 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod348878ea_aa9f_4306_af10_6a56583447a4.slice/crio-4d6246100aafc9dcc92b0edfac8408d1bba30aa0d3382d3be1cbdf84969b73b2 WatchSource:0}: Error finding container 4d6246100aafc9dcc92b0edfac8408d1bba30aa0d3382d3be1cbdf84969b73b2: Status 404 returned error can't find the container with id 4d6246100aafc9dcc92b0edfac8408d1bba30aa0d3382d3be1cbdf84969b73b2 Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.991195 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556730-jvprz" event={"ID":"348878ea-aa9f-4306-af10-6a56583447a4","Type":"ContainerStarted","Data":"4d6246100aafc9dcc92b0edfac8408d1bba30aa0d3382d3be1cbdf84969b73b2"} Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 
12:10:00.991488 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.997436 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 12:10:03 crc kubenswrapper[4837]: I0313 12:10:03.009474 4837 generic.go:334] "Generic (PLEG): container finished" podID="348878ea-aa9f-4306-af10-6a56583447a4" containerID="400f25fc20473b4a0989af2562c9f1940f8ca26a8e2532da0bcde1d8c359bf39" exitCode=0 Mar 13 12:10:03 crc kubenswrapper[4837]: I0313 12:10:03.009533 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556730-jvprz" event={"ID":"348878ea-aa9f-4306-af10-6a56583447a4","Type":"ContainerDied","Data":"400f25fc20473b4a0989af2562c9f1940f8ca26a8e2532da0bcde1d8c359bf39"} Mar 13 12:10:04 crc kubenswrapper[4837]: I0313 12:10:04.432580 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556730-jvprz" Mar 13 12:10:04 crc kubenswrapper[4837]: I0313 12:10:04.522492 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2j9n\" (UniqueName: \"kubernetes.io/projected/348878ea-aa9f-4306-af10-6a56583447a4-kube-api-access-z2j9n\") pod \"348878ea-aa9f-4306-af10-6a56583447a4\" (UID: \"348878ea-aa9f-4306-af10-6a56583447a4\") " Mar 13 12:10:04 crc kubenswrapper[4837]: I0313 12:10:04.528925 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/348878ea-aa9f-4306-af10-6a56583447a4-kube-api-access-z2j9n" (OuterVolumeSpecName: "kube-api-access-z2j9n") pod "348878ea-aa9f-4306-af10-6a56583447a4" (UID: "348878ea-aa9f-4306-af10-6a56583447a4"). InnerVolumeSpecName "kube-api-access-z2j9n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:04 crc kubenswrapper[4837]: I0313 12:10:04.625836 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2j9n\" (UniqueName: \"kubernetes.io/projected/348878ea-aa9f-4306-af10-6a56583447a4-kube-api-access-z2j9n\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:05 crc kubenswrapper[4837]: I0313 12:10:05.029845 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556730-jvprz" event={"ID":"348878ea-aa9f-4306-af10-6a56583447a4","Type":"ContainerDied","Data":"4d6246100aafc9dcc92b0edfac8408d1bba30aa0d3382d3be1cbdf84969b73b2"} Mar 13 12:10:05 crc kubenswrapper[4837]: I0313 12:10:05.029894 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d6246100aafc9dcc92b0edfac8408d1bba30aa0d3382d3be1cbdf84969b73b2" Mar 13 12:10:05 crc kubenswrapper[4837]: I0313 12:10:05.029894 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556730-jvprz" Mar 13 12:10:05 crc kubenswrapper[4837]: I0313 12:10:05.496457 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556724-st6gn"] Mar 13 12:10:05 crc kubenswrapper[4837]: I0313 12:10:05.505828 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556724-st6gn"] Mar 13 12:10:07 crc kubenswrapper[4837]: I0313 12:10:07.060822 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bda3181-d107-4de8-b754-e5e67dd8dd9c" path="/var/lib/kubelet/pods/8bda3181-d107-4de8-b754-e5e67dd8dd9c/volumes" Mar 13 12:10:08 crc kubenswrapper[4837]: I0313 12:10:08.815486 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 12:10:09 crc kubenswrapper[4837]: I0313 12:10:09.517452 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 
12:10:13 crc kubenswrapper[4837]: I0313 12:10:13.268823 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="e7b01be4-73b6-48eb-a06d-4fb38863d982" containerName="rabbitmq" containerID="cri-o://616ab6849fdbe4a471544990fac8e7c0dc2c1e3c72338a214f97078a3b1bb01c" gracePeriod=604796 Mar 13 12:10:13 crc kubenswrapper[4837]: I0313 12:10:13.444344 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="13254c8b-516c-435e-9db2-a8d518434f29" containerName="rabbitmq" containerID="cri-o://0464fd995746ce013b42a116039d645f924aa9c972effad4862f7a836f1488e1" gracePeriod=604797 Mar 13 12:10:13 crc kubenswrapper[4837]: I0313 12:10:13.454712 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="e7b01be4-73b6-48eb-a06d-4fb38863d982" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Mar 13 12:10:19 crc kubenswrapper[4837]: I0313 12:10:19.872027 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 12:10:19 crc kubenswrapper[4837]: I0313 12:10:19.983684 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.037477 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl7pd\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-kube-api-access-pl7pd\") pod \"e7b01be4-73b6-48eb-a06d-4fb38863d982\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.037605 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-plugins-conf\") pod \"e7b01be4-73b6-48eb-a06d-4fb38863d982\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.037694 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e7b01be4-73b6-48eb-a06d-4fb38863d982-erlang-cookie-secret\") pod \"e7b01be4-73b6-48eb-a06d-4fb38863d982\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.037758 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-config-data\") pod \"e7b01be4-73b6-48eb-a06d-4fb38863d982\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.037810 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-confd\") pod \"e7b01be4-73b6-48eb-a06d-4fb38863d982\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.037877 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-server-conf\") pod \"e7b01be4-73b6-48eb-a06d-4fb38863d982\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.037978 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-plugins\") pod \"e7b01be4-73b6-48eb-a06d-4fb38863d982\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.038009 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-erlang-cookie\") pod \"e7b01be4-73b6-48eb-a06d-4fb38863d982\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.038095 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e7b01be4-73b6-48eb-a06d-4fb38863d982-pod-info\") pod \"e7b01be4-73b6-48eb-a06d-4fb38863d982\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.038142 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"e7b01be4-73b6-48eb-a06d-4fb38863d982\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.038175 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-tls\") pod \"e7b01be4-73b6-48eb-a06d-4fb38863d982\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 
12:10:20.044838 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e7b01be4-73b6-48eb-a06d-4fb38863d982" (UID: "e7b01be4-73b6-48eb-a06d-4fb38863d982"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.045482 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e7b01be4-73b6-48eb-a06d-4fb38863d982" (UID: "e7b01be4-73b6-48eb-a06d-4fb38863d982"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.045598 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "e7b01be4-73b6-48eb-a06d-4fb38863d982" (UID: "e7b01be4-73b6-48eb-a06d-4fb38863d982"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.045854 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b01be4-73b6-48eb-a06d-4fb38863d982-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e7b01be4-73b6-48eb-a06d-4fb38863d982" (UID: "e7b01be4-73b6-48eb-a06d-4fb38863d982"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.046057 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e7b01be4-73b6-48eb-a06d-4fb38863d982-pod-info" (OuterVolumeSpecName: "pod-info") pod "e7b01be4-73b6-48eb-a06d-4fb38863d982" (UID: "e7b01be4-73b6-48eb-a06d-4fb38863d982"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.051022 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e7b01be4-73b6-48eb-a06d-4fb38863d982" (UID: "e7b01be4-73b6-48eb-a06d-4fb38863d982"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.053391 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-kube-api-access-pl7pd" (OuterVolumeSpecName: "kube-api-access-pl7pd") pod "e7b01be4-73b6-48eb-a06d-4fb38863d982" (UID: "e7b01be4-73b6-48eb-a06d-4fb38863d982"). InnerVolumeSpecName "kube-api-access-pl7pd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.055929 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e7b01be4-73b6-48eb-a06d-4fb38863d982" (UID: "e7b01be4-73b6-48eb-a06d-4fb38863d982"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.102255 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-server-conf" (OuterVolumeSpecName: "server-conf") pod "e7b01be4-73b6-48eb-a06d-4fb38863d982" (UID: "e7b01be4-73b6-48eb-a06d-4fb38863d982"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.112212 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-config-data" (OuterVolumeSpecName: "config-data") pod "e7b01be4-73b6-48eb-a06d-4fb38863d982" (UID: "e7b01be4-73b6-48eb-a06d-4fb38863d982"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.140410 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/13254c8b-516c-435e-9db2-a8d518434f29-pod-info\") pod \"13254c8b-516c-435e-9db2-a8d518434f29\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.140474 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-config-data\") pod \"13254c8b-516c-435e-9db2-a8d518434f29\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.140914 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-server-conf\") pod \"13254c8b-516c-435e-9db2-a8d518434f29\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " Mar 13 12:10:20 crc 
kubenswrapper[4837]: I0313 12:10:20.141027 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-plugins-conf\") pod \"13254c8b-516c-435e-9db2-a8d518434f29\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.141119 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"13254c8b-516c-435e-9db2-a8d518434f29\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.141156 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-erlang-cookie\") pod \"13254c8b-516c-435e-9db2-a8d518434f29\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.141183 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfz87\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-kube-api-access-wfz87\") pod \"13254c8b-516c-435e-9db2-a8d518434f29\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.141215 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-confd\") pod \"13254c8b-516c-435e-9db2-a8d518434f29\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.141244 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/13254c8b-516c-435e-9db2-a8d518434f29-erlang-cookie-secret\") pod \"13254c8b-516c-435e-9db2-a8d518434f29\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.141300 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-tls\") pod \"13254c8b-516c-435e-9db2-a8d518434f29\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.141319 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-plugins\") pod \"13254c8b-516c-435e-9db2-a8d518434f29\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.141855 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "13254c8b-516c-435e-9db2-a8d518434f29" (UID: "13254c8b-516c-435e-9db2-a8d518434f29"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.142630 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "13254c8b-516c-435e-9db2-a8d518434f29" (UID: "13254c8b-516c-435e-9db2-a8d518434f29"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.145001 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.145035 4837 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.145166 4837 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-server-conf\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.145185 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.145199 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.145197 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "13254c8b-516c-435e-9db2-a8d518434f29" (UID: "13254c8b-516c-435e-9db2-a8d518434f29"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.145210 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.145227 4837 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e7b01be4-73b6-48eb-a06d-4fb38863d982-pod-info\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.145256 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.145278 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.145294 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl7pd\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-kube-api-access-pl7pd\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.145305 4837 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.145317 4837 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e7b01be4-73b6-48eb-a06d-4fb38863d982-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 
12:10:20.146921 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "13254c8b-516c-435e-9db2-a8d518434f29" (UID: "13254c8b-516c-435e-9db2-a8d518434f29"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.147443 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "13254c8b-516c-435e-9db2-a8d518434f29" (UID: "13254c8b-516c-435e-9db2-a8d518434f29"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.148526 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13254c8b-516c-435e-9db2-a8d518434f29-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "13254c8b-516c-435e-9db2-a8d518434f29" (UID: "13254c8b-516c-435e-9db2-a8d518434f29"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.153329 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e7b01be4-73b6-48eb-a06d-4fb38863d982" (UID: "e7b01be4-73b6-48eb-a06d-4fb38863d982"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.181403 4837 generic.go:334] "Generic (PLEG): container finished" podID="e7b01be4-73b6-48eb-a06d-4fb38863d982" containerID="616ab6849fdbe4a471544990fac8e7c0dc2c1e3c72338a214f97078a3b1bb01c" exitCode=0 Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.181490 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e7b01be4-73b6-48eb-a06d-4fb38863d982","Type":"ContainerDied","Data":"616ab6849fdbe4a471544990fac8e7c0dc2c1e3c72338a214f97078a3b1bb01c"} Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.181517 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e7b01be4-73b6-48eb-a06d-4fb38863d982","Type":"ContainerDied","Data":"c8180a84e0af5653dde0ac3c7b4b0a9aa55749048023693725605b5733ff15c7"} Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.181534 4837 scope.go:117] "RemoveContainer" containerID="616ab6849fdbe4a471544990fac8e7c0dc2c1e3c72338a214f97078a3b1bb01c" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.181701 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.190136 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"13254c8b-516c-435e-9db2-a8d518434f29","Type":"ContainerDied","Data":"0464fd995746ce013b42a116039d645f924aa9c972effad4862f7a836f1488e1"} Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.190091 4837 generic.go:334] "Generic (PLEG): container finished" podID="13254c8b-516c-435e-9db2-a8d518434f29" containerID="0464fd995746ce013b42a116039d645f924aa9c972effad4862f7a836f1488e1" exitCode=0 Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.190325 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"13254c8b-516c-435e-9db2-a8d518434f29","Type":"ContainerDied","Data":"5d7d6eb76793e1d7753bbea8ba4648a937e2549b33bfd032cf29e8f6d1e62f4c"} Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.190346 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.198150 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/13254c8b-516c-435e-9db2-a8d518434f29-pod-info" (OuterVolumeSpecName: "pod-info") pod "13254c8b-516c-435e-9db2-a8d518434f29" (UID: "13254c8b-516c-435e-9db2-a8d518434f29"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.200775 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-kube-api-access-wfz87" (OuterVolumeSpecName: "kube-api-access-wfz87") pod "13254c8b-516c-435e-9db2-a8d518434f29" (UID: "13254c8b-516c-435e-9db2-a8d518434f29"). InnerVolumeSpecName "kube-api-access-wfz87". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.213555 4837 scope.go:117] "RemoveContainer" containerID="afe3a88a0e8205fefe122a8099e4acc29a3ebc22c1a9a9cfe3c00f5ab1794007" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.224474 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-config-data" (OuterVolumeSpecName: "config-data") pod "13254c8b-516c-435e-9db2-a8d518434f29" (UID: "13254c8b-516c-435e-9db2-a8d518434f29"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.238095 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.250500 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.250554 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.250565 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfz87\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-kube-api-access-wfz87\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.250575 4837 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/13254c8b-516c-435e-9db2-a8d518434f29-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc 
kubenswrapper[4837]: I0313 12:10:20.250584 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.250592 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.250601 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.250608 4837 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/13254c8b-516c-435e-9db2-a8d518434f29-pod-info\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.250616 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.256761 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.267317 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-server-conf" (OuterVolumeSpecName: "server-conf") pod "13254c8b-516c-435e-9db2-a8d518434f29" (UID: "13254c8b-516c-435e-9db2-a8d518434f29"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.267722 4837 scope.go:117] "RemoveContainer" containerID="616ab6849fdbe4a471544990fac8e7c0dc2c1e3c72338a214f97078a3b1bb01c" Mar 13 12:10:20 crc kubenswrapper[4837]: E0313 12:10:20.273034 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"616ab6849fdbe4a471544990fac8e7c0dc2c1e3c72338a214f97078a3b1bb01c\": container with ID starting with 616ab6849fdbe4a471544990fac8e7c0dc2c1e3c72338a214f97078a3b1bb01c not found: ID does not exist" containerID="616ab6849fdbe4a471544990fac8e7c0dc2c1e3c72338a214f97078a3b1bb01c" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.273084 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"616ab6849fdbe4a471544990fac8e7c0dc2c1e3c72338a214f97078a3b1bb01c"} err="failed to get container status \"616ab6849fdbe4a471544990fac8e7c0dc2c1e3c72338a214f97078a3b1bb01c\": rpc error: code = NotFound desc = could not find container \"616ab6849fdbe4a471544990fac8e7c0dc2c1e3c72338a214f97078a3b1bb01c\": container with ID starting with 616ab6849fdbe4a471544990fac8e7c0dc2c1e3c72338a214f97078a3b1bb01c not found: ID does not exist" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.273109 4837 scope.go:117] "RemoveContainer" containerID="afe3a88a0e8205fefe122a8099e4acc29a3ebc22c1a9a9cfe3c00f5ab1794007" Mar 13 12:10:20 crc kubenswrapper[4837]: E0313 12:10:20.273576 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afe3a88a0e8205fefe122a8099e4acc29a3ebc22c1a9a9cfe3c00f5ab1794007\": container with ID starting with afe3a88a0e8205fefe122a8099e4acc29a3ebc22c1a9a9cfe3c00f5ab1794007 not found: ID does not exist" containerID="afe3a88a0e8205fefe122a8099e4acc29a3ebc22c1a9a9cfe3c00f5ab1794007" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.273719 
4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afe3a88a0e8205fefe122a8099e4acc29a3ebc22c1a9a9cfe3c00f5ab1794007"} err="failed to get container status \"afe3a88a0e8205fefe122a8099e4acc29a3ebc22c1a9a9cfe3c00f5ab1794007\": rpc error: code = NotFound desc = could not find container \"afe3a88a0e8205fefe122a8099e4acc29a3ebc22c1a9a9cfe3c00f5ab1794007\": container with ID starting with afe3a88a0e8205fefe122a8099e4acc29a3ebc22c1a9a9cfe3c00f5ab1794007 not found: ID does not exist" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.273822 4837 scope.go:117] "RemoveContainer" containerID="0464fd995746ce013b42a116039d645f924aa9c972effad4862f7a836f1488e1" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.296791 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.303969 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 12:10:20 crc kubenswrapper[4837]: E0313 12:10:20.307442 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="348878ea-aa9f-4306-af10-6a56583447a4" containerName="oc" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.307600 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="348878ea-aa9f-4306-af10-6a56583447a4" containerName="oc" Mar 13 12:10:20 crc kubenswrapper[4837]: E0313 12:10:20.307956 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b01be4-73b6-48eb-a06d-4fb38863d982" containerName="rabbitmq" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.308173 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b01be4-73b6-48eb-a06d-4fb38863d982" containerName="rabbitmq" Mar 13 12:10:20 crc kubenswrapper[4837]: E0313 12:10:20.308291 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b01be4-73b6-48eb-a06d-4fb38863d982" containerName="setup-container" Mar 13 12:10:20 crc 
kubenswrapper[4837]: I0313 12:10:20.308363 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b01be4-73b6-48eb-a06d-4fb38863d982" containerName="setup-container" Mar 13 12:10:20 crc kubenswrapper[4837]: E0313 12:10:20.308439 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13254c8b-516c-435e-9db2-a8d518434f29" containerName="rabbitmq" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.308503 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="13254c8b-516c-435e-9db2-a8d518434f29" containerName="rabbitmq" Mar 13 12:10:20 crc kubenswrapper[4837]: E0313 12:10:20.308584 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13254c8b-516c-435e-9db2-a8d518434f29" containerName="setup-container" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.308696 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="13254c8b-516c-435e-9db2-a8d518434f29" containerName="setup-container" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.309212 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="13254c8b-516c-435e-9db2-a8d518434f29" containerName="rabbitmq" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.309347 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b01be4-73b6-48eb-a06d-4fb38863d982" containerName="rabbitmq" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.309522 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="348878ea-aa9f-4306-af10-6a56583447a4" containerName="oc" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.310827 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.313142 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.317531 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.317632 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.317825 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.318373 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-8bxdt" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.318443 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.318449 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.318751 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.332955 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.351577 4837 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-server-conf\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.351611 4837 reconciler_common.go:293] "Volume detached for 
volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.352259 4837 scope.go:117] "RemoveContainer" containerID="d7e3a2439b933c4a76e0a0472aaff3cf352a36e55d6c5a3aa674478c5299b9be" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.384487 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "13254c8b-516c-435e-9db2-a8d518434f29" (UID: "13254c8b-516c-435e-9db2-a8d518434f29"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.384670 4837 scope.go:117] "RemoveContainer" containerID="0464fd995746ce013b42a116039d645f924aa9c972effad4862f7a836f1488e1" Mar 13 12:10:20 crc kubenswrapper[4837]: E0313 12:10:20.385167 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0464fd995746ce013b42a116039d645f924aa9c972effad4862f7a836f1488e1\": container with ID starting with 0464fd995746ce013b42a116039d645f924aa9c972effad4862f7a836f1488e1 not found: ID does not exist" containerID="0464fd995746ce013b42a116039d645f924aa9c972effad4862f7a836f1488e1" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.385212 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0464fd995746ce013b42a116039d645f924aa9c972effad4862f7a836f1488e1"} err="failed to get container status \"0464fd995746ce013b42a116039d645f924aa9c972effad4862f7a836f1488e1\": rpc error: code = NotFound desc = could not find container \"0464fd995746ce013b42a116039d645f924aa9c972effad4862f7a836f1488e1\": container with ID starting with 0464fd995746ce013b42a116039d645f924aa9c972effad4862f7a836f1488e1 not found: ID does not exist" Mar 13 
12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.385275 4837 scope.go:117] "RemoveContainer" containerID="d7e3a2439b933c4a76e0a0472aaff3cf352a36e55d6c5a3aa674478c5299b9be" Mar 13 12:10:20 crc kubenswrapper[4837]: E0313 12:10:20.385776 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7e3a2439b933c4a76e0a0472aaff3cf352a36e55d6c5a3aa674478c5299b9be\": container with ID starting with d7e3a2439b933c4a76e0a0472aaff3cf352a36e55d6c5a3aa674478c5299b9be not found: ID does not exist" containerID="d7e3a2439b933c4a76e0a0472aaff3cf352a36e55d6c5a3aa674478c5299b9be" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.385805 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7e3a2439b933c4a76e0a0472aaff3cf352a36e55d6c5a3aa674478c5299b9be"} err="failed to get container status \"d7e3a2439b933c4a76e0a0472aaff3cf352a36e55d6c5a3aa674478c5299b9be\": rpc error: code = NotFound desc = could not find container \"d7e3a2439b933c4a76e0a0472aaff3cf352a36e55d6c5a3aa674478c5299b9be\": container with ID starting with d7e3a2439b933c4a76e0a0472aaff3cf352a36e55d6c5a3aa674478c5299b9be not found: ID does not exist" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.453448 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/245e5a26-d143-4e4d-bae8-094275a91574-pod-info\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.453544 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/245e5a26-d143-4e4d-bae8-094275a91574-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " 
pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.453564 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/245e5a26-d143-4e4d-bae8-094275a91574-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.453583 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/245e5a26-d143-4e4d-bae8-094275a91574-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.453711 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/245e5a26-d143-4e4d-bae8-094275a91574-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.453746 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/245e5a26-d143-4e4d-bae8-094275a91574-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.453772 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvf5h\" (UniqueName: \"kubernetes.io/projected/245e5a26-d143-4e4d-bae8-094275a91574-kube-api-access-jvf5h\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 
12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.453847 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/245e5a26-d143-4e4d-bae8-094275a91574-config-data\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.453905 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.453926 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/245e5a26-d143-4e4d-bae8-094275a91574-server-conf\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.453972 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/245e5a26-d143-4e4d-bae8-094275a91574-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.454087 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.557325 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/245e5a26-d143-4e4d-bae8-094275a91574-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.557457 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/245e5a26-d143-4e4d-bae8-094275a91574-pod-info\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.557547 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/245e5a26-d143-4e4d-bae8-094275a91574-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.557570 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/245e5a26-d143-4e4d-bae8-094275a91574-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.557602 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/245e5a26-d143-4e4d-bae8-094275a91574-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.557694 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/245e5a26-d143-4e4d-bae8-094275a91574-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " 
pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.557772 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/245e5a26-d143-4e4d-bae8-094275a91574-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.557795 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvf5h\" (UniqueName: \"kubernetes.io/projected/245e5a26-d143-4e4d-bae8-094275a91574-kube-api-access-jvf5h\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.557826 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/245e5a26-d143-4e4d-bae8-094275a91574-config-data\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.557879 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.557902 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/245e5a26-d143-4e4d-bae8-094275a91574-server-conf\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.559273 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/245e5a26-d143-4e4d-bae8-094275a91574-server-conf\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.560275 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/245e5a26-d143-4e4d-bae8-094275a91574-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.560430 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/245e5a26-d143-4e4d-bae8-094275a91574-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.561045 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/245e5a26-d143-4e4d-bae8-094275a91574-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.561057 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.562605 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/245e5a26-d143-4e4d-bae8-094275a91574-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.564265 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/245e5a26-d143-4e4d-bae8-094275a91574-pod-info\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.564816 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/245e5a26-d143-4e4d-bae8-094275a91574-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.564944 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/245e5a26-d143-4e4d-bae8-094275a91574-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.571087 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/245e5a26-d143-4e4d-bae8-094275a91574-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.587455 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvf5h\" (UniqueName: \"kubernetes.io/projected/245e5a26-d143-4e4d-bae8-094275a91574-kube-api-access-jvf5h\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.591560 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.605286 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.616864 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.619016 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.624236 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mb2tp" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.624544 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.624570 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.624650 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.624751 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.624822 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.625001 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.635041 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod 
\"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.639103 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.658500 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.761100 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p57rp\" (UniqueName: \"kubernetes.io/projected/90028d66-5134-4c09-af15-71e754f49bf3-kube-api-access-p57rp\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.761182 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/90028d66-5134-4c09-af15-71e754f49bf3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.761209 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/90028d66-5134-4c09-af15-71e754f49bf3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.761231 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/90028d66-5134-4c09-af15-71e754f49bf3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.761277 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/90028d66-5134-4c09-af15-71e754f49bf3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.761325 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/90028d66-5134-4c09-af15-71e754f49bf3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.761355 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90028d66-5134-4c09-af15-71e754f49bf3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.761374 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/90028d66-5134-4c09-af15-71e754f49bf3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.761404 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.761423 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/90028d66-5134-4c09-af15-71e754f49bf3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.761443 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/90028d66-5134-4c09-af15-71e754f49bf3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.863650 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/90028d66-5134-4c09-af15-71e754f49bf3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.863715 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/90028d66-5134-4c09-af15-71e754f49bf3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.863740 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/90028d66-5134-4c09-af15-71e754f49bf3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 
12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.863791 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/90028d66-5134-4c09-af15-71e754f49bf3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.863841 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/90028d66-5134-4c09-af15-71e754f49bf3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.863868 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90028d66-5134-4c09-af15-71e754f49bf3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.863891 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/90028d66-5134-4c09-af15-71e754f49bf3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.863930 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.863957 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/90028d66-5134-4c09-af15-71e754f49bf3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.863982 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/90028d66-5134-4c09-af15-71e754f49bf3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.864052 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p57rp\" (UniqueName: \"kubernetes.io/projected/90028d66-5134-4c09-af15-71e754f49bf3-kube-api-access-p57rp\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.865614 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/90028d66-5134-4c09-af15-71e754f49bf3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.865765 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.868873 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/90028d66-5134-4c09-af15-71e754f49bf3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.869571 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/90028d66-5134-4c09-af15-71e754f49bf3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.869955 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/90028d66-5134-4c09-af15-71e754f49bf3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.870201 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/90028d66-5134-4c09-af15-71e754f49bf3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.871274 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/90028d66-5134-4c09-af15-71e754f49bf3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.873117 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90028d66-5134-4c09-af15-71e754f49bf3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.874789 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/90028d66-5134-4c09-af15-71e754f49bf3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.875077 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/90028d66-5134-4c09-af15-71e754f49bf3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.882915 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p57rp\" (UniqueName: \"kubernetes.io/projected/90028d66-5134-4c09-af15-71e754f49bf3-kube-api-access-p57rp\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.900798 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.058707 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13254c8b-516c-435e-9db2-a8d518434f29" path="/var/lib/kubelet/pods/13254c8b-516c-435e-9db2-a8d518434f29/volumes" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.059717 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7b01be4-73b6-48eb-a06d-4fb38863d982" 
path="/var/lib/kubelet/pods/e7b01be4-73b6-48eb-a06d-4fb38863d982/volumes" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.109331 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.140380 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.216867 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"245e5a26-d143-4e4d-bae8-094275a91574","Type":"ContainerStarted","Data":"669ae6a1c242dd485f718081cac25de23c2b27fae539b95b098d625b50bfde0c"} Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.576261 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.707214 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-smx7d"] Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.708921 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.711491 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.721197 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-smx7d"] Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.891579 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.891669 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.891776 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkhxz\" (UniqueName: \"kubernetes.io/projected/8561b7f2-0c2e-44bc-8f9f-be22c9624182-kube-api-access-bkhxz\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.891809 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " 
pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.891828 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-config\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.891855 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.891874 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.993953 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.994126 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkhxz\" (UniqueName: \"kubernetes.io/projected/8561b7f2-0c2e-44bc-8f9f-be22c9624182-kube-api-access-bkhxz\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: 
\"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.994172 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.994201 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-config\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.994240 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.994261 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.994304 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " 
pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.995207 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.995210 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.995379 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-config\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.995554 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.995710 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.996095 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:22 crc kubenswrapper[4837]: I0313 12:10:22.016379 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkhxz\" (UniqueName: \"kubernetes.io/projected/8561b7f2-0c2e-44bc-8f9f-be22c9624182-kube-api-access-bkhxz\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:22 crc kubenswrapper[4837]: I0313 12:10:22.068252 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:22 crc kubenswrapper[4837]: I0313 12:10:22.245095 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"90028d66-5134-4c09-af15-71e754f49bf3","Type":"ContainerStarted","Data":"bd5a45941299ca017c1df94156bbe52bd40d508730d729dff288c7e63fa8ac1a"} Mar 13 12:10:22 crc kubenswrapper[4837]: I0313 12:10:22.527161 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-smx7d"] Mar 13 12:10:23 crc kubenswrapper[4837]: I0313 12:10:23.255390 4837 generic.go:334] "Generic (PLEG): container finished" podID="8561b7f2-0c2e-44bc-8f9f-be22c9624182" containerID="7e7bf0350454cdf6a55ff53b12033e233fa5322f6b553c3d41b099505b8fd05e" exitCode=0 Mar 13 12:10:23 crc kubenswrapper[4837]: I0313 12:10:23.255463 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" event={"ID":"8561b7f2-0c2e-44bc-8f9f-be22c9624182","Type":"ContainerDied","Data":"7e7bf0350454cdf6a55ff53b12033e233fa5322f6b553c3d41b099505b8fd05e"} Mar 13 12:10:23 crc kubenswrapper[4837]: I0313 
12:10:23.255742 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" event={"ID":"8561b7f2-0c2e-44bc-8f9f-be22c9624182","Type":"ContainerStarted","Data":"b996ce339d9115538429d8844c7070a737c07d9e64960dc8589245407a365c7b"} Mar 13 12:10:23 crc kubenswrapper[4837]: I0313 12:10:23.257431 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"245e5a26-d143-4e4d-bae8-094275a91574","Type":"ContainerStarted","Data":"968024f3e7a34fb2251686b56e03a3d25328f059240b5fd48e62fa0112da765c"} Mar 13 12:10:24 crc kubenswrapper[4837]: I0313 12:10:24.271214 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" event={"ID":"8561b7f2-0c2e-44bc-8f9f-be22c9624182","Type":"ContainerStarted","Data":"36c7996282b2e0069d58fbcaadfb735cbfdb0845c0b839279be73a9a205804d1"} Mar 13 12:10:24 crc kubenswrapper[4837]: I0313 12:10:24.271917 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:24 crc kubenswrapper[4837]: I0313 12:10:24.274522 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"90028d66-5134-4c09-af15-71e754f49bf3","Type":"ContainerStarted","Data":"3707d1215c01b01e085f415765474c7f7c0f6cccb71ba878a0cb6e1bc6e40be6"} Mar 13 12:10:24 crc kubenswrapper[4837]: I0313 12:10:24.297827 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" podStartSLOduration=3.297807042 podStartE2EDuration="3.297807042s" podCreationTimestamp="2026-03-13 12:10:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:24.286674592 +0000 UTC m=+1339.924941365" watchObservedRunningTime="2026-03-13 12:10:24.297807042 +0000 UTC m=+1339.936073805" Mar 13 12:10:27 crc kubenswrapper[4837]: I0313 
12:10:27.235144 4837 scope.go:117] "RemoveContainer" containerID="945088ee0e42cd72cf70828366cf9ffb988a0eebcb4e0d5222d7e3f1439eeef4" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.070095 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.125977 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-ql9zn"] Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.126216 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" podUID="6d9c85e6-5c66-4c94-996b-0278453fd29c" containerName="dnsmasq-dns" containerID="cri-o://daf8bcea9fd0562663127a8a93a369152f67e1407f2a8bee704558594419d5d6" gracePeriod=10 Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.297549 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55478c4467-bxc2t"] Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.299469 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.312478 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-bxc2t"] Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.372553 4837 generic.go:334] "Generic (PLEG): container finished" podID="6d9c85e6-5c66-4c94-996b-0278453fd29c" containerID="daf8bcea9fd0562663127a8a93a369152f67e1407f2a8bee704558594419d5d6" exitCode=0 Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.372610 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" event={"ID":"6d9c85e6-5c66-4c94-996b-0278453fd29c","Type":"ContainerDied","Data":"daf8bcea9fd0562663127a8a93a369152f67e1407f2a8bee704558594419d5d6"} Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.402163 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.402208 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.402252 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-dns-svc\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " 
pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.402364 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.402411 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqtvc\" (UniqueName: \"kubernetes.io/projected/98f4bdc5-6452-4630-a299-6234d8a63bf8-kube-api-access-gqtvc\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.402465 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-config\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.402523 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.504268 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-config\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: 
\"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.504342 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.504414 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.504435 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.504477 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-dns-svc\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.504565 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " 
pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.504606 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqtvc\" (UniqueName: \"kubernetes.io/projected/98f4bdc5-6452-4630-a299-6234d8a63bf8-kube-api-access-gqtvc\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.505360 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-config\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.505393 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.505541 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-dns-svc\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.505562 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 
12:10:32.506053 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.506198 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.530882 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqtvc\" (UniqueName: \"kubernetes.io/projected/98f4bdc5-6452-4630-a299-6234d8a63bf8-kube-api-access-gqtvc\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.633687 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.771064 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.912716 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-ovsdbserver-sb\") pod \"6d9c85e6-5c66-4c94-996b-0278453fd29c\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.912758 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-dns-swift-storage-0\") pod \"6d9c85e6-5c66-4c94-996b-0278453fd29c\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.912798 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdwv8\" (UniqueName: \"kubernetes.io/projected/6d9c85e6-5c66-4c94-996b-0278453fd29c-kube-api-access-zdwv8\") pod \"6d9c85e6-5c66-4c94-996b-0278453fd29c\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.912878 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-dns-svc\") pod \"6d9c85e6-5c66-4c94-996b-0278453fd29c\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.912927 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-ovsdbserver-nb\") pod \"6d9c85e6-5c66-4c94-996b-0278453fd29c\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.913089 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-config\") pod \"6d9c85e6-5c66-4c94-996b-0278453fd29c\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.918885 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d9c85e6-5c66-4c94-996b-0278453fd29c-kube-api-access-zdwv8" (OuterVolumeSpecName: "kube-api-access-zdwv8") pod "6d9c85e6-5c66-4c94-996b-0278453fd29c" (UID: "6d9c85e6-5c66-4c94-996b-0278453fd29c"). InnerVolumeSpecName "kube-api-access-zdwv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.969235 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-config" (OuterVolumeSpecName: "config") pod "6d9c85e6-5c66-4c94-996b-0278453fd29c" (UID: "6d9c85e6-5c66-4c94-996b-0278453fd29c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.973061 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6d9c85e6-5c66-4c94-996b-0278453fd29c" (UID: "6d9c85e6-5c66-4c94-996b-0278453fd29c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.973594 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6d9c85e6-5c66-4c94-996b-0278453fd29c" (UID: "6d9c85e6-5c66-4c94-996b-0278453fd29c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.974002 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6d9c85e6-5c66-4c94-996b-0278453fd29c" (UID: "6d9c85e6-5c66-4c94-996b-0278453fd29c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.981162 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6d9c85e6-5c66-4c94-996b-0278453fd29c" (UID: "6d9c85e6-5c66-4c94-996b-0278453fd29c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.015609 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.015660 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.015672 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.015683 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" 
Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.015696 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.015707 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdwv8\" (UniqueName: \"kubernetes.io/projected/6d9c85e6-5c66-4c94-996b-0278453fd29c-kube-api-access-zdwv8\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.100057 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-bxc2t"] Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.382133 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" event={"ID":"6d9c85e6-5c66-4c94-996b-0278453fd29c","Type":"ContainerDied","Data":"b047a2dec8a72a4326c0ddd3270cd5183bb922bd572e2b9241c600e148c1eea7"} Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.382434 4837 scope.go:117] "RemoveContainer" containerID="daf8bcea9fd0562663127a8a93a369152f67e1407f2a8bee704558594419d5d6" Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.382279 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.383997 4837 generic.go:334] "Generic (PLEG): container finished" podID="98f4bdc5-6452-4630-a299-6234d8a63bf8" containerID="def881830f11c938b1f9f72bad0579bbe4bed6b4616080645ffbfc4ade37dd2a" exitCode=0 Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.384059 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-bxc2t" event={"ID":"98f4bdc5-6452-4630-a299-6234d8a63bf8","Type":"ContainerDied","Data":"def881830f11c938b1f9f72bad0579bbe4bed6b4616080645ffbfc4ade37dd2a"} Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.384081 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-bxc2t" event={"ID":"98f4bdc5-6452-4630-a299-6234d8a63bf8","Type":"ContainerStarted","Data":"9df755008f8ca09ac808d1f5e7998ca4a4d73b0d2045552cf2e9dc79be637045"} Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.573458 4837 scope.go:117] "RemoveContainer" containerID="0ac8018727334fad931d8e9b782b5ff6d28c6c9743c0f7da2e79336a427ee5cf" Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.598081 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-ql9zn"] Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.609660 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-ql9zn"] Mar 13 12:10:34 crc kubenswrapper[4837]: I0313 12:10:34.393885 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-bxc2t" event={"ID":"98f4bdc5-6452-4630-a299-6234d8a63bf8","Type":"ContainerStarted","Data":"7c3bcb357ed3d7475a2ec4327761fa5510a02f8b713eea793ac3ea35316b3c01"} Mar 13 12:10:34 crc kubenswrapper[4837]: I0313 12:10:34.394227 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:34 crc kubenswrapper[4837]: I0313 
12:10:34.416782 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55478c4467-bxc2t" podStartSLOduration=2.416765755 podStartE2EDuration="2.416765755s" podCreationTimestamp="2026-03-13 12:10:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:34.413054958 +0000 UTC m=+1350.051321741" watchObservedRunningTime="2026-03-13 12:10:34.416765755 +0000 UTC m=+1350.055032508" Mar 13 12:10:35 crc kubenswrapper[4837]: I0313 12:10:35.057798 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d9c85e6-5c66-4c94-996b-0278453fd29c" path="/var/lib/kubelet/pods/6d9c85e6-5c66-4c94-996b-0278453fd29c/volumes" Mar 13 12:10:35 crc kubenswrapper[4837]: I0313 12:10:35.484180 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:10:35 crc kubenswrapper[4837]: I0313 12:10:35.484250 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:10:42 crc kubenswrapper[4837]: I0313 12:10:42.635505 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:42 crc kubenswrapper[4837]: I0313 12:10:42.704048 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-smx7d"] Mar 13 12:10:42 crc kubenswrapper[4837]: I0313 12:10:42.704940 4837 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" podUID="8561b7f2-0c2e-44bc-8f9f-be22c9624182" containerName="dnsmasq-dns" containerID="cri-o://36c7996282b2e0069d58fbcaadfb735cbfdb0845c0b839279be73a9a205804d1" gracePeriod=10 Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.182026 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.310949 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-ovsdbserver-sb\") pod \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.311122 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-dns-svc\") pod \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.311213 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-dns-swift-storage-0\") pod \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.311228 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-openstack-edpm-ipam\") pod \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.311264 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-bkhxz\" (UniqueName: \"kubernetes.io/projected/8561b7f2-0c2e-44bc-8f9f-be22c9624182-kube-api-access-bkhxz\") pod \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.311280 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-ovsdbserver-nb\") pod \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.311347 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-config\") pod \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.323097 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8561b7f2-0c2e-44bc-8f9f-be22c9624182-kube-api-access-bkhxz" (OuterVolumeSpecName: "kube-api-access-bkhxz") pod "8561b7f2-0c2e-44bc-8f9f-be22c9624182" (UID: "8561b7f2-0c2e-44bc-8f9f-be22c9624182"). InnerVolumeSpecName "kube-api-access-bkhxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.362855 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8561b7f2-0c2e-44bc-8f9f-be22c9624182" (UID: "8561b7f2-0c2e-44bc-8f9f-be22c9624182"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.372950 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8561b7f2-0c2e-44bc-8f9f-be22c9624182" (UID: "8561b7f2-0c2e-44bc-8f9f-be22c9624182"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.374143 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-config" (OuterVolumeSpecName: "config") pod "8561b7f2-0c2e-44bc-8f9f-be22c9624182" (UID: "8561b7f2-0c2e-44bc-8f9f-be22c9624182"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.385913 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8561b7f2-0c2e-44bc-8f9f-be22c9624182" (UID: "8561b7f2-0c2e-44bc-8f9f-be22c9624182"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.386777 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8561b7f2-0c2e-44bc-8f9f-be22c9624182" (UID: "8561b7f2-0c2e-44bc-8f9f-be22c9624182"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.391971 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "8561b7f2-0c2e-44bc-8f9f-be22c9624182" (UID: "8561b7f2-0c2e-44bc-8f9f-be22c9624182"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.414345 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.414391 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.414408 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.414421 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkhxz\" (UniqueName: \"kubernetes.io/projected/8561b7f2-0c2e-44bc-8f9f-be22c9624182-kube-api-access-bkhxz\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.414433 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.414445 4837 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.414459 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.472445 4837 generic.go:334] "Generic (PLEG): container finished" podID="8561b7f2-0c2e-44bc-8f9f-be22c9624182" containerID="36c7996282b2e0069d58fbcaadfb735cbfdb0845c0b839279be73a9a205804d1" exitCode=0 Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.472490 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" event={"ID":"8561b7f2-0c2e-44bc-8f9f-be22c9624182","Type":"ContainerDied","Data":"36c7996282b2e0069d58fbcaadfb735cbfdb0845c0b839279be73a9a205804d1"} Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.472523 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" event={"ID":"8561b7f2-0c2e-44bc-8f9f-be22c9624182","Type":"ContainerDied","Data":"b996ce339d9115538429d8844c7070a737c07d9e64960dc8589245407a365c7b"} Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.472548 4837 scope.go:117] "RemoveContainer" containerID="36c7996282b2e0069d58fbcaadfb735cbfdb0845c0b839279be73a9a205804d1" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.472585 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.493679 4837 scope.go:117] "RemoveContainer" containerID="7e7bf0350454cdf6a55ff53b12033e233fa5322f6b553c3d41b099505b8fd05e" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.509303 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-smx7d"] Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.519043 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-smx7d"] Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.531768 4837 scope.go:117] "RemoveContainer" containerID="36c7996282b2e0069d58fbcaadfb735cbfdb0845c0b839279be73a9a205804d1" Mar 13 12:10:43 crc kubenswrapper[4837]: E0313 12:10:43.532200 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36c7996282b2e0069d58fbcaadfb735cbfdb0845c0b839279be73a9a205804d1\": container with ID starting with 36c7996282b2e0069d58fbcaadfb735cbfdb0845c0b839279be73a9a205804d1 not found: ID does not exist" containerID="36c7996282b2e0069d58fbcaadfb735cbfdb0845c0b839279be73a9a205804d1" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.532241 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36c7996282b2e0069d58fbcaadfb735cbfdb0845c0b839279be73a9a205804d1"} err="failed to get container status \"36c7996282b2e0069d58fbcaadfb735cbfdb0845c0b839279be73a9a205804d1\": rpc error: code = NotFound desc = could not find container \"36c7996282b2e0069d58fbcaadfb735cbfdb0845c0b839279be73a9a205804d1\": container with ID starting with 36c7996282b2e0069d58fbcaadfb735cbfdb0845c0b839279be73a9a205804d1 not found: ID does not exist" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.532266 4837 scope.go:117] "RemoveContainer" containerID="7e7bf0350454cdf6a55ff53b12033e233fa5322f6b553c3d41b099505b8fd05e" Mar 13 
12:10:43 crc kubenswrapper[4837]: E0313 12:10:43.532490 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e7bf0350454cdf6a55ff53b12033e233fa5322f6b553c3d41b099505b8fd05e\": container with ID starting with 7e7bf0350454cdf6a55ff53b12033e233fa5322f6b553c3d41b099505b8fd05e not found: ID does not exist" containerID="7e7bf0350454cdf6a55ff53b12033e233fa5322f6b553c3d41b099505b8fd05e" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.532521 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e7bf0350454cdf6a55ff53b12033e233fa5322f6b553c3d41b099505b8fd05e"} err="failed to get container status \"7e7bf0350454cdf6a55ff53b12033e233fa5322f6b553c3d41b099505b8fd05e\": rpc error: code = NotFound desc = could not find container \"7e7bf0350454cdf6a55ff53b12033e233fa5322f6b553c3d41b099505b8fd05e\": container with ID starting with 7e7bf0350454cdf6a55ff53b12033e233fa5322f6b553c3d41b099505b8fd05e not found: ID does not exist" Mar 13 12:10:45 crc kubenswrapper[4837]: I0313 12:10:45.059066 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8561b7f2-0c2e-44bc-8f9f-be22c9624182" path="/var/lib/kubelet/pods/8561b7f2-0c2e-44bc-8f9f-be22c9624182/volumes" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.583539 4837 generic.go:334] "Generic (PLEG): container finished" podID="90028d66-5134-4c09-af15-71e754f49bf3" containerID="3707d1215c01b01e085f415765474c7f7c0f6cccb71ba878a0cb6e1bc6e40be6" exitCode=0 Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.583652 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"90028d66-5134-4c09-af15-71e754f49bf3","Type":"ContainerDied","Data":"3707d1215c01b01e085f415765474c7f7c0f6cccb71ba878a0cb6e1bc6e40be6"} Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.586077 4837 generic.go:334] "Generic (PLEG): container finished" 
podID="245e5a26-d143-4e4d-bae8-094275a91574" containerID="968024f3e7a34fb2251686b56e03a3d25328f059240b5fd48e62fa0112da765c" exitCode=0 Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.586123 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"245e5a26-d143-4e4d-bae8-094275a91574","Type":"ContainerDied","Data":"968024f3e7a34fb2251686b56e03a3d25328f059240b5fd48e62fa0112da765c"} Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.766459 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6"] Mar 13 12:10:55 crc kubenswrapper[4837]: E0313 12:10:55.766956 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d9c85e6-5c66-4c94-996b-0278453fd29c" containerName="dnsmasq-dns" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.766992 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d9c85e6-5c66-4c94-996b-0278453fd29c" containerName="dnsmasq-dns" Mar 13 12:10:55 crc kubenswrapper[4837]: E0313 12:10:55.767013 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d9c85e6-5c66-4c94-996b-0278453fd29c" containerName="init" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.767022 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d9c85e6-5c66-4c94-996b-0278453fd29c" containerName="init" Mar 13 12:10:55 crc kubenswrapper[4837]: E0313 12:10:55.767078 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8561b7f2-0c2e-44bc-8f9f-be22c9624182" containerName="init" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.767090 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8561b7f2-0c2e-44bc-8f9f-be22c9624182" containerName="init" Mar 13 12:10:55 crc kubenswrapper[4837]: E0313 12:10:55.767106 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8561b7f2-0c2e-44bc-8f9f-be22c9624182" containerName="dnsmasq-dns" Mar 13 12:10:55 crc 
kubenswrapper[4837]: I0313 12:10:55.767114 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8561b7f2-0c2e-44bc-8f9f-be22c9624182" containerName="dnsmasq-dns" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.767388 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8561b7f2-0c2e-44bc-8f9f-be22c9624182" containerName="dnsmasq-dns" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.767415 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d9c85e6-5c66-4c94-996b-0278453fd29c" containerName="dnsmasq-dns" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.768181 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.770732 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.770923 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.771119 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.771280 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.801107 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6"] Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.844860 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-ssh-key-openstack-edpm-ipam\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.844973 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.845308 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.845388 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc689\" (UniqueName: \"kubernetes.io/projected/bfedd3e5-e8d7-4311-9a0d-30276ce40418-kube-api-access-lc689\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.947028 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc689\" (UniqueName: \"kubernetes.io/projected/bfedd3e5-e8d7-4311-9a0d-30276ce40418-kube-api-access-lc689\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.947162 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.947191 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.947266 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.954304 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.954375 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.954930 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.964803 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc689\" (UniqueName: \"kubernetes.io/projected/bfedd3e5-e8d7-4311-9a0d-30276ce40418-kube-api-access-lc689\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:56 crc kubenswrapper[4837]: I0313 12:10:56.143870 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:56 crc kubenswrapper[4837]: I0313 12:10:56.607074 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"90028d66-5134-4c09-af15-71e754f49bf3","Type":"ContainerStarted","Data":"542ea034b4a23a312c031070a7d0e00e62cbf6d03c66433e44b8f1f7bad49766"} Mar 13 12:10:56 crc kubenswrapper[4837]: I0313 12:10:56.608759 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:56 crc kubenswrapper[4837]: I0313 12:10:56.611028 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"245e5a26-d143-4e4d-bae8-094275a91574","Type":"ContainerStarted","Data":"a06e916058107256eb45141b722629ae6ffc45e6bbd11377cfb66669bc921055"} Mar 13 12:10:56 crc kubenswrapper[4837]: I0313 12:10:56.611660 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 13 12:10:56 crc kubenswrapper[4837]: I0313 12:10:56.686038 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.686016439 podStartE2EDuration="36.686016439s" podCreationTimestamp="2026-03-13 12:10:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:56.683389956 +0000 UTC m=+1372.321656719" watchObservedRunningTime="2026-03-13 12:10:56.686016439 +0000 UTC m=+1372.324283202" Mar 13 12:10:56 crc kubenswrapper[4837]: I0313 12:10:56.689148 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.689127568 podStartE2EDuration="36.689127568s" podCreationTimestamp="2026-03-13 12:10:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-13 12:10:56.6534215 +0000 UTC m=+1372.291688263" watchObservedRunningTime="2026-03-13 12:10:56.689127568 +0000 UTC m=+1372.327394341" Mar 13 12:10:56 crc kubenswrapper[4837]: W0313 12:10:56.730066 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfedd3e5_e8d7_4311_9a0d_30276ce40418.slice/crio-a866d1865427553eddb46b92dbd201d722a968316d595895a86fe588ce4b2bdc WatchSource:0}: Error finding container a866d1865427553eddb46b92dbd201d722a968316d595895a86fe588ce4b2bdc: Status 404 returned error can't find the container with id a866d1865427553eddb46b92dbd201d722a968316d595895a86fe588ce4b2bdc Mar 13 12:10:56 crc kubenswrapper[4837]: I0313 12:10:56.730492 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6"] Mar 13 12:10:57 crc kubenswrapper[4837]: I0313 12:10:57.628855 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" event={"ID":"bfedd3e5-e8d7-4311-9a0d-30276ce40418","Type":"ContainerStarted","Data":"a866d1865427553eddb46b92dbd201d722a968316d595895a86fe588ce4b2bdc"} Mar 13 12:11:05 crc kubenswrapper[4837]: I0313 12:11:05.483805 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:11:05 crc kubenswrapper[4837]: I0313 12:11:05.484120 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 
12:11:07 crc kubenswrapper[4837]: I0313 12:11:07.285088 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 12:11:07 crc kubenswrapper[4837]: I0313 12:11:07.765370 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" event={"ID":"bfedd3e5-e8d7-4311-9a0d-30276ce40418","Type":"ContainerStarted","Data":"d8a71036699a5c429f4958ee9f2fd50fed11eca00708b1169cdeb9b07548dda7"} Mar 13 12:11:07 crc kubenswrapper[4837]: I0313 12:11:07.790610 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" podStartSLOduration=2.241170697 podStartE2EDuration="12.790585644s" podCreationTimestamp="2026-03-13 12:10:55 +0000 UTC" firstStartedPulling="2026-03-13 12:10:56.73320192 +0000 UTC m=+1372.371468693" lastFinishedPulling="2026-03-13 12:11:07.282616877 +0000 UTC m=+1382.920883640" observedRunningTime="2026-03-13 12:11:07.78695733 +0000 UTC m=+1383.425224093" watchObservedRunningTime="2026-03-13 12:11:07.790585644 +0000 UTC m=+1383.428852407" Mar 13 12:11:10 crc kubenswrapper[4837]: I0313 12:11:10.642876 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 13 12:11:11 crc kubenswrapper[4837]: I0313 12:11:11.112794 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:11:17 crc kubenswrapper[4837]: I0313 12:11:17.853166 4837 generic.go:334] "Generic (PLEG): container finished" podID="bfedd3e5-e8d7-4311-9a0d-30276ce40418" containerID="d8a71036699a5c429f4958ee9f2fd50fed11eca00708b1169cdeb9b07548dda7" exitCode=0 Mar 13 12:11:17 crc kubenswrapper[4837]: I0313 12:11:17.853250 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" 
event={"ID":"bfedd3e5-e8d7-4311-9a0d-30276ce40418","Type":"ContainerDied","Data":"d8a71036699a5c429f4958ee9f2fd50fed11eca00708b1169cdeb9b07548dda7"} Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.294353 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.383433 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-ssh-key-openstack-edpm-ipam\") pod \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.383522 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-repo-setup-combined-ca-bundle\") pod \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.383551 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc689\" (UniqueName: \"kubernetes.io/projected/bfedd3e5-e8d7-4311-9a0d-30276ce40418-kube-api-access-lc689\") pod \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.383708 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-inventory\") pod \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.389680 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/bfedd3e5-e8d7-4311-9a0d-30276ce40418-kube-api-access-lc689" (OuterVolumeSpecName: "kube-api-access-lc689") pod "bfedd3e5-e8d7-4311-9a0d-30276ce40418" (UID: "bfedd3e5-e8d7-4311-9a0d-30276ce40418"). InnerVolumeSpecName "kube-api-access-lc689". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.393730 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "bfedd3e5-e8d7-4311-9a0d-30276ce40418" (UID: "bfedd3e5-e8d7-4311-9a0d-30276ce40418"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.416745 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bfedd3e5-e8d7-4311-9a0d-30276ce40418" (UID: "bfedd3e5-e8d7-4311-9a0d-30276ce40418"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.416791 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-inventory" (OuterVolumeSpecName: "inventory") pod "bfedd3e5-e8d7-4311-9a0d-30276ce40418" (UID: "bfedd3e5-e8d7-4311-9a0d-30276ce40418"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.486061 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.486098 4837 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.486110 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lc689\" (UniqueName: \"kubernetes.io/projected/bfedd3e5-e8d7-4311-9a0d-30276ce40418-kube-api-access-lc689\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.486123 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.876450 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" event={"ID":"bfedd3e5-e8d7-4311-9a0d-30276ce40418","Type":"ContainerDied","Data":"a866d1865427553eddb46b92dbd201d722a968316d595895a86fe588ce4b2bdc"} Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.876497 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a866d1865427553eddb46b92dbd201d722a968316d595895a86fe588ce4b2bdc" Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.876558 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:19.992524 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt"] Mar 13 12:11:20 crc kubenswrapper[4837]: E0313 12:11:19.993325 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfedd3e5-e8d7-4311-9a0d-30276ce40418" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:19.993341 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfedd3e5-e8d7-4311-9a0d-30276ce40418" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:19.993696 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfedd3e5-e8d7-4311-9a0d-30276ce40418" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:19.994488 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.015883 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.015992 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.016036 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.016350 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.031535 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt"] Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.101703 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pz9nt\" (UID: \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.101904 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-959mk\" (UniqueName: \"kubernetes.io/projected/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-kube-api-access-959mk\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pz9nt\" (UID: \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.101943 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pz9nt\" (UID: \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.204702 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pz9nt\" (UID: \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.204918 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-959mk\" (UniqueName: \"kubernetes.io/projected/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-kube-api-access-959mk\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pz9nt\" (UID: \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.204981 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pz9nt\" (UID: \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.208990 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pz9nt\" (UID: 
\"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.211423 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pz9nt\" (UID: \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.256861 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-959mk\" (UniqueName: \"kubernetes.io/projected/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-kube-api-access-959mk\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pz9nt\" (UID: \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.352729 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.871151 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt"] Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.889087 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" event={"ID":"0b7402b1-0b76-4ffa-b37f-6e014183f6a6","Type":"ContainerStarted","Data":"b89bf6808b23352420b0ada98d22d3af61da2bcbf1e90c1ae0b4a1eaddb6418a"} Mar 13 12:11:21 crc kubenswrapper[4837]: I0313 12:11:21.902766 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" event={"ID":"0b7402b1-0b76-4ffa-b37f-6e014183f6a6","Type":"ContainerStarted","Data":"45b5081c34f5edc0075777fcfa00e2d62084726d76624bcb0330bd64e206316b"} Mar 13 12:11:21 crc kubenswrapper[4837]: I0313 12:11:21.940152 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" podStartSLOduration=2.496164174 podStartE2EDuration="2.94012824s" podCreationTimestamp="2026-03-13 12:11:19 +0000 UTC" firstStartedPulling="2026-03-13 12:11:20.8758945 +0000 UTC m=+1396.514161263" lastFinishedPulling="2026-03-13 12:11:21.319858566 +0000 UTC m=+1396.958125329" observedRunningTime="2026-03-13 12:11:21.931866699 +0000 UTC m=+1397.570133502" watchObservedRunningTime="2026-03-13 12:11:21.94012824 +0000 UTC m=+1397.578395003" Mar 13 12:11:23 crc kubenswrapper[4837]: I0313 12:11:23.921726 4837 generic.go:334] "Generic (PLEG): container finished" podID="0b7402b1-0b76-4ffa-b37f-6e014183f6a6" containerID="45b5081c34f5edc0075777fcfa00e2d62084726d76624bcb0330bd64e206316b" exitCode=0 Mar 13 12:11:23 crc kubenswrapper[4837]: I0313 12:11:23.921832 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" event={"ID":"0b7402b1-0b76-4ffa-b37f-6e014183f6a6","Type":"ContainerDied","Data":"45b5081c34f5edc0075777fcfa00e2d62084726d76624bcb0330bd64e206316b"} Mar 13 12:11:25 crc kubenswrapper[4837]: I0313 12:11:25.337971 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" Mar 13 12:11:25 crc kubenswrapper[4837]: I0313 12:11:25.502983 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-inventory\") pod \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\" (UID: \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\") " Mar 13 12:11:25 crc kubenswrapper[4837]: I0313 12:11:25.503399 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-ssh-key-openstack-edpm-ipam\") pod \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\" (UID: \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\") " Mar 13 12:11:25 crc kubenswrapper[4837]: I0313 12:11:25.503531 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-959mk\" (UniqueName: \"kubernetes.io/projected/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-kube-api-access-959mk\") pod \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\" (UID: \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\") " Mar 13 12:11:25 crc kubenswrapper[4837]: I0313 12:11:25.511994 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-kube-api-access-959mk" (OuterVolumeSpecName: "kube-api-access-959mk") pod "0b7402b1-0b76-4ffa-b37f-6e014183f6a6" (UID: "0b7402b1-0b76-4ffa-b37f-6e014183f6a6"). InnerVolumeSpecName "kube-api-access-959mk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:11:25 crc kubenswrapper[4837]: I0313 12:11:25.537153 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-inventory" (OuterVolumeSpecName: "inventory") pod "0b7402b1-0b76-4ffa-b37f-6e014183f6a6" (UID: "0b7402b1-0b76-4ffa-b37f-6e014183f6a6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:25 crc kubenswrapper[4837]: I0313 12:11:25.540402 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0b7402b1-0b76-4ffa-b37f-6e014183f6a6" (UID: "0b7402b1-0b76-4ffa-b37f-6e014183f6a6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:25 crc kubenswrapper[4837]: I0313 12:11:25.605440 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-959mk\" (UniqueName: \"kubernetes.io/projected/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-kube-api-access-959mk\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:25 crc kubenswrapper[4837]: I0313 12:11:25.605479 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:25 crc kubenswrapper[4837]: I0313 12:11:25.605493 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:25 crc kubenswrapper[4837]: I0313 12:11:25.941493 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" 
event={"ID":"0b7402b1-0b76-4ffa-b37f-6e014183f6a6","Type":"ContainerDied","Data":"b89bf6808b23352420b0ada98d22d3af61da2bcbf1e90c1ae0b4a1eaddb6418a"} Mar 13 12:11:25 crc kubenswrapper[4837]: I0313 12:11:25.941549 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b89bf6808b23352420b0ada98d22d3af61da2bcbf1e90c1ae0b4a1eaddb6418a" Mar 13 12:11:25 crc kubenswrapper[4837]: I0313 12:11:25.941824 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.417814 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj"] Mar 13 12:11:26 crc kubenswrapper[4837]: E0313 12:11:26.419028 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b7402b1-0b76-4ffa-b37f-6e014183f6a6" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.419151 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b7402b1-0b76-4ffa-b37f-6e014183f6a6" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.419423 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b7402b1-0b76-4ffa-b37f-6e014183f6a6" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.420301 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.422310 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.422578 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.422777 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.422971 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.426181 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj"] Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.520424 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.520568 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srznj\" (UniqueName: \"kubernetes.io/projected/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-kube-api-access-srznj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 
12:11:26.520650 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.520687 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.622387 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.622963 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.623097 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.623254 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srznj\" (UniqueName: \"kubernetes.io/projected/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-kube-api-access-srznj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.629365 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.631429 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.632385 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.646704 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srznj\" (UniqueName: \"kubernetes.io/projected/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-kube-api-access-srznj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.737069 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:27 crc kubenswrapper[4837]: I0313 12:11:27.231460 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj"] Mar 13 12:11:27 crc kubenswrapper[4837]: I0313 12:11:27.960832 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" event={"ID":"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6","Type":"ContainerStarted","Data":"eb1333ce0764b093d73fd17e5289d7edb5cff5fe2036f478ee8e0d94f4ed2a16"} Mar 13 12:11:27 crc kubenswrapper[4837]: I0313 12:11:27.961212 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" event={"ID":"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6","Type":"ContainerStarted","Data":"011972819d057043c306599fed7fa5342801862959ffcdf2a7d97969e48cbd76"} Mar 13 12:11:27 crc kubenswrapper[4837]: I0313 12:11:27.985259 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" podStartSLOduration=1.551860845 podStartE2EDuration="1.985228845s" podCreationTimestamp="2026-03-13 12:11:26 +0000 UTC" firstStartedPulling="2026-03-13 12:11:27.232548828 +0000 UTC m=+1402.870815591" 
lastFinishedPulling="2026-03-13 12:11:27.665916798 +0000 UTC m=+1403.304183591" observedRunningTime="2026-03-13 12:11:27.98221523 +0000 UTC m=+1403.620481983" watchObservedRunningTime="2026-03-13 12:11:27.985228845 +0000 UTC m=+1403.623495608" Mar 13 12:11:35 crc kubenswrapper[4837]: I0313 12:11:35.484481 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:11:35 crc kubenswrapper[4837]: I0313 12:11:35.485103 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:11:35 crc kubenswrapper[4837]: I0313 12:11:35.485162 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 12:11:35 crc kubenswrapper[4837]: I0313 12:11:35.485978 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1c6d9e7e9de5c8ffd75bcf8d5717605d713d8068815596e54e918770c94282bc"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:11:35 crc kubenswrapper[4837]: I0313 12:11:35.486034 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" 
containerID="cri-o://1c6d9e7e9de5c8ffd75bcf8d5717605d713d8068815596e54e918770c94282bc" gracePeriod=600 Mar 13 12:11:36 crc kubenswrapper[4837]: I0313 12:11:36.032467 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="1c6d9e7e9de5c8ffd75bcf8d5717605d713d8068815596e54e918770c94282bc" exitCode=0 Mar 13 12:11:36 crc kubenswrapper[4837]: I0313 12:11:36.032554 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"1c6d9e7e9de5c8ffd75bcf8d5717605d713d8068815596e54e918770c94282bc"} Mar 13 12:11:36 crc kubenswrapper[4837]: I0313 12:11:36.032752 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a"} Mar 13 12:11:36 crc kubenswrapper[4837]: I0313 12:11:36.032775 4837 scope.go:117] "RemoveContainer" containerID="75c6e15833f1c4c6d83b741f42f4ce0c9378844641d1d149fd75349d257dfc71" Mar 13 12:12:00 crc kubenswrapper[4837]: I0313 12:12:00.148592 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556732-84qfh"] Mar 13 12:12:00 crc kubenswrapper[4837]: I0313 12:12:00.150697 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556732-84qfh" Mar 13 12:12:00 crc kubenswrapper[4837]: I0313 12:12:00.154186 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:12:00 crc kubenswrapper[4837]: I0313 12:12:00.154418 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:12:00 crc kubenswrapper[4837]: I0313 12:12:00.154597 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:12:00 crc kubenswrapper[4837]: I0313 12:12:00.160749 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556732-84qfh"] Mar 13 12:12:00 crc kubenswrapper[4837]: I0313 12:12:00.305380 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9t7g\" (UniqueName: \"kubernetes.io/projected/564139cd-c95b-45c7-bf55-00c944313930-kube-api-access-k9t7g\") pod \"auto-csr-approver-29556732-84qfh\" (UID: \"564139cd-c95b-45c7-bf55-00c944313930\") " pod="openshift-infra/auto-csr-approver-29556732-84qfh" Mar 13 12:12:00 crc kubenswrapper[4837]: I0313 12:12:00.407429 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9t7g\" (UniqueName: \"kubernetes.io/projected/564139cd-c95b-45c7-bf55-00c944313930-kube-api-access-k9t7g\") pod \"auto-csr-approver-29556732-84qfh\" (UID: \"564139cd-c95b-45c7-bf55-00c944313930\") " pod="openshift-infra/auto-csr-approver-29556732-84qfh" Mar 13 12:12:00 crc kubenswrapper[4837]: I0313 12:12:00.426324 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9t7g\" (UniqueName: \"kubernetes.io/projected/564139cd-c95b-45c7-bf55-00c944313930-kube-api-access-k9t7g\") pod \"auto-csr-approver-29556732-84qfh\" (UID: \"564139cd-c95b-45c7-bf55-00c944313930\") " 
pod="openshift-infra/auto-csr-approver-29556732-84qfh" Mar 13 12:12:00 crc kubenswrapper[4837]: I0313 12:12:00.480695 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556732-84qfh" Mar 13 12:12:00 crc kubenswrapper[4837]: I0313 12:12:00.967856 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556732-84qfh"] Mar 13 12:12:00 crc kubenswrapper[4837]: W0313 12:12:00.970940 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod564139cd_c95b_45c7_bf55_00c944313930.slice/crio-c2159da86fd08eedf43b76b9b65dccf8e53beca1390d1dbd03003e21443c7f69 WatchSource:0}: Error finding container c2159da86fd08eedf43b76b9b65dccf8e53beca1390d1dbd03003e21443c7f69: Status 404 returned error can't find the container with id c2159da86fd08eedf43b76b9b65dccf8e53beca1390d1dbd03003e21443c7f69 Mar 13 12:12:01 crc kubenswrapper[4837]: I0313 12:12:01.262269 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556732-84qfh" event={"ID":"564139cd-c95b-45c7-bf55-00c944313930","Type":"ContainerStarted","Data":"c2159da86fd08eedf43b76b9b65dccf8e53beca1390d1dbd03003e21443c7f69"} Mar 13 12:12:03 crc kubenswrapper[4837]: I0313 12:12:03.296051 4837 generic.go:334] "Generic (PLEG): container finished" podID="564139cd-c95b-45c7-bf55-00c944313930" containerID="e317d41369cc2f3ddf2e1c831d3041b43d32e03d72c05e27b06993576c33a0e8" exitCode=0 Mar 13 12:12:03 crc kubenswrapper[4837]: I0313 12:12:03.296460 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556732-84qfh" event={"ID":"564139cd-c95b-45c7-bf55-00c944313930","Type":"ContainerDied","Data":"e317d41369cc2f3ddf2e1c831d3041b43d32e03d72c05e27b06993576c33a0e8"} Mar 13 12:12:04 crc kubenswrapper[4837]: I0313 12:12:04.614800 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556732-84qfh" Mar 13 12:12:04 crc kubenswrapper[4837]: I0313 12:12:04.792936 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9t7g\" (UniqueName: \"kubernetes.io/projected/564139cd-c95b-45c7-bf55-00c944313930-kube-api-access-k9t7g\") pod \"564139cd-c95b-45c7-bf55-00c944313930\" (UID: \"564139cd-c95b-45c7-bf55-00c944313930\") " Mar 13 12:12:04 crc kubenswrapper[4837]: I0313 12:12:04.800925 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/564139cd-c95b-45c7-bf55-00c944313930-kube-api-access-k9t7g" (OuterVolumeSpecName: "kube-api-access-k9t7g") pod "564139cd-c95b-45c7-bf55-00c944313930" (UID: "564139cd-c95b-45c7-bf55-00c944313930"). InnerVolumeSpecName "kube-api-access-k9t7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:12:04 crc kubenswrapper[4837]: I0313 12:12:04.895664 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9t7g\" (UniqueName: \"kubernetes.io/projected/564139cd-c95b-45c7-bf55-00c944313930-kube-api-access-k9t7g\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:05 crc kubenswrapper[4837]: I0313 12:12:05.316897 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556732-84qfh" event={"ID":"564139cd-c95b-45c7-bf55-00c944313930","Type":"ContainerDied","Data":"c2159da86fd08eedf43b76b9b65dccf8e53beca1390d1dbd03003e21443c7f69"} Mar 13 12:12:05 crc kubenswrapper[4837]: I0313 12:12:05.316944 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2159da86fd08eedf43b76b9b65dccf8e53beca1390d1dbd03003e21443c7f69" Mar 13 12:12:05 crc kubenswrapper[4837]: I0313 12:12:05.317017 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556732-84qfh" Mar 13 12:12:05 crc kubenswrapper[4837]: I0313 12:12:05.693410 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556726-gdbfm"] Mar 13 12:12:05 crc kubenswrapper[4837]: I0313 12:12:05.702611 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556726-gdbfm"] Mar 13 12:12:07 crc kubenswrapper[4837]: I0313 12:12:07.060687 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83f46fff-3510-4758-82a0-30099640fa33" path="/var/lib/kubelet/pods/83f46fff-3510-4758-82a0-30099640fa33/volumes" Mar 13 12:12:27 crc kubenswrapper[4837]: I0313 12:12:27.465546 4837 scope.go:117] "RemoveContainer" containerID="8b3e536f3d4311421b7a8a53f994fc3c95b97d5e112a955f101e290d9b221b2d" Mar 13 12:12:27 crc kubenswrapper[4837]: I0313 12:12:27.524304 4837 scope.go:117] "RemoveContainer" containerID="e03f96aaa50d1c9241f9e2fad6e8df257f1e78642de37d3f89872036b5b55220" Mar 13 12:12:27 crc kubenswrapper[4837]: I0313 12:12:27.544308 4837 scope.go:117] "RemoveContainer" containerID="1858aaffb80ca26b2ecab85a7aa907d93bda6b050db7fd69c55fcebb623536ef" Mar 13 12:12:27 crc kubenswrapper[4837]: I0313 12:12:27.609088 4837 scope.go:117] "RemoveContainer" containerID="a3d9d75be9f89d9ac614473e4e3a4f535965320bd55937576eb6b69f6cb8f8b9" Mar 13 12:12:27 crc kubenswrapper[4837]: I0313 12:12:27.641545 4837 scope.go:117] "RemoveContainer" containerID="57b8ae831c66c62748afbdcfeed21457125293d241eef5e2c9e04fa2bc86f046" Mar 13 12:12:27 crc kubenswrapper[4837]: I0313 12:12:27.661331 4837 scope.go:117] "RemoveContainer" containerID="f8f234cd31d0132024229747ad2a8277b3ce2f09009460632455703d08203032" Mar 13 12:13:27 crc kubenswrapper[4837]: I0313 12:13:27.809314 4837 scope.go:117] "RemoveContainer" containerID="40a0292d1dfe433f0d44082f12bb7e30ff5d447f6e395b1d7a570420f6252eeb" Mar 13 12:13:27 crc kubenswrapper[4837]: I0313 12:13:27.849075 4837 
scope.go:117] "RemoveContainer" containerID="ccf4fdc9606b0ae8a6ecc82badd31da8c6fddc1f4294bee13d5805f8da627b43" Mar 13 12:13:35 crc kubenswrapper[4837]: I0313 12:13:35.484198 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:13:35 crc kubenswrapper[4837]: I0313 12:13:35.485209 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.154436 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556734-g7zt7"] Mar 13 12:14:00 crc kubenswrapper[4837]: E0313 12:14:00.155372 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564139cd-c95b-45c7-bf55-00c944313930" containerName="oc" Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.155415 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="564139cd-c95b-45c7-bf55-00c944313930" containerName="oc" Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.155668 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="564139cd-c95b-45c7-bf55-00c944313930" containerName="oc" Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.156485 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556734-g7zt7" Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.159104 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.159289 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.159462 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.174526 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556734-g7zt7"] Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.177279 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvpjq\" (UniqueName: \"kubernetes.io/projected/b41b916d-46ab-43e8-b624-bb1fb6aaf2f8-kube-api-access-kvpjq\") pod \"auto-csr-approver-29556734-g7zt7\" (UID: \"b41b916d-46ab-43e8-b624-bb1fb6aaf2f8\") " pod="openshift-infra/auto-csr-approver-29556734-g7zt7" Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.279341 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvpjq\" (UniqueName: \"kubernetes.io/projected/b41b916d-46ab-43e8-b624-bb1fb6aaf2f8-kube-api-access-kvpjq\") pod \"auto-csr-approver-29556734-g7zt7\" (UID: \"b41b916d-46ab-43e8-b624-bb1fb6aaf2f8\") " pod="openshift-infra/auto-csr-approver-29556734-g7zt7" Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.310347 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvpjq\" (UniqueName: \"kubernetes.io/projected/b41b916d-46ab-43e8-b624-bb1fb6aaf2f8-kube-api-access-kvpjq\") pod \"auto-csr-approver-29556734-g7zt7\" (UID: \"b41b916d-46ab-43e8-b624-bb1fb6aaf2f8\") " 
pod="openshift-infra/auto-csr-approver-29556734-g7zt7" Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.490363 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556734-g7zt7" Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.926576 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556734-g7zt7"] Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.934558 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:14:01 crc kubenswrapper[4837]: I0313 12:14:01.587993 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556734-g7zt7" event={"ID":"b41b916d-46ab-43e8-b624-bb1fb6aaf2f8","Type":"ContainerStarted","Data":"3f46648b6ee317959b85aab3af5418a8561594a01ed81bb2dd102242f281029f"} Mar 13 12:14:02 crc kubenswrapper[4837]: I0313 12:14:02.598662 4837 generic.go:334] "Generic (PLEG): container finished" podID="b41b916d-46ab-43e8-b624-bb1fb6aaf2f8" containerID="618f29cef46a018933eff3564372eb6b93270ae38a4b8bb52de53e9e241ebfba" exitCode=0 Mar 13 12:14:02 crc kubenswrapper[4837]: I0313 12:14:02.598765 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556734-g7zt7" event={"ID":"b41b916d-46ab-43e8-b624-bb1fb6aaf2f8","Type":"ContainerDied","Data":"618f29cef46a018933eff3564372eb6b93270ae38a4b8bb52de53e9e241ebfba"} Mar 13 12:14:03 crc kubenswrapper[4837]: I0313 12:14:03.914717 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556734-g7zt7" Mar 13 12:14:03 crc kubenswrapper[4837]: I0313 12:14:03.963260 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvpjq\" (UniqueName: \"kubernetes.io/projected/b41b916d-46ab-43e8-b624-bb1fb6aaf2f8-kube-api-access-kvpjq\") pod \"b41b916d-46ab-43e8-b624-bb1fb6aaf2f8\" (UID: \"b41b916d-46ab-43e8-b624-bb1fb6aaf2f8\") " Mar 13 12:14:03 crc kubenswrapper[4837]: I0313 12:14:03.970726 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b41b916d-46ab-43e8-b624-bb1fb6aaf2f8-kube-api-access-kvpjq" (OuterVolumeSpecName: "kube-api-access-kvpjq") pod "b41b916d-46ab-43e8-b624-bb1fb6aaf2f8" (UID: "b41b916d-46ab-43e8-b624-bb1fb6aaf2f8"). InnerVolumeSpecName "kube-api-access-kvpjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:14:04 crc kubenswrapper[4837]: I0313 12:14:04.065694 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvpjq\" (UniqueName: \"kubernetes.io/projected/b41b916d-46ab-43e8-b624-bb1fb6aaf2f8-kube-api-access-kvpjq\") on node \"crc\" DevicePath \"\"" Mar 13 12:14:04 crc kubenswrapper[4837]: I0313 12:14:04.618510 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556734-g7zt7" event={"ID":"b41b916d-46ab-43e8-b624-bb1fb6aaf2f8","Type":"ContainerDied","Data":"3f46648b6ee317959b85aab3af5418a8561594a01ed81bb2dd102242f281029f"} Mar 13 12:14:04 crc kubenswrapper[4837]: I0313 12:14:04.618590 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f46648b6ee317959b85aab3af5418a8561594a01ed81bb2dd102242f281029f" Mar 13 12:14:04 crc kubenswrapper[4837]: I0313 12:14:04.618595 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556734-g7zt7" Mar 13 12:14:04 crc kubenswrapper[4837]: I0313 12:14:04.982329 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556728-7n29h"] Mar 13 12:14:04 crc kubenswrapper[4837]: I0313 12:14:04.991074 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556728-7n29h"] Mar 13 12:14:05 crc kubenswrapper[4837]: I0313 12:14:05.059382 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47ae408b-faad-4a52-ad09-428242645381" path="/var/lib/kubelet/pods/47ae408b-faad-4a52-ad09-428242645381/volumes" Mar 13 12:14:05 crc kubenswrapper[4837]: I0313 12:14:05.484471 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:14:05 crc kubenswrapper[4837]: I0313 12:14:05.484532 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:14:22 crc kubenswrapper[4837]: I0313 12:14:22.799249 4837 generic.go:334] "Generic (PLEG): container finished" podID="2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6" containerID="eb1333ce0764b093d73fd17e5289d7edb5cff5fe2036f478ee8e0d94f4ed2a16" exitCode=0 Mar 13 12:14:22 crc kubenswrapper[4837]: I0313 12:14:22.799929 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" 
event={"ID":"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6","Type":"ContainerDied","Data":"eb1333ce0764b093d73fd17e5289d7edb5cff5fe2036f478ee8e0d94f4ed2a16"} Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.200193 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.284492 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-ssh-key-openstack-edpm-ipam\") pod \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.284660 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-bootstrap-combined-ca-bundle\") pod \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.284859 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srznj\" (UniqueName: \"kubernetes.io/projected/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-kube-api-access-srznj\") pod \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.284950 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-inventory\") pod \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.300365 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6" (UID: "2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.300470 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-kube-api-access-srznj" (OuterVolumeSpecName: "kube-api-access-srznj") pod "2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6" (UID: "2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6"). InnerVolumeSpecName "kube-api-access-srznj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.312524 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-inventory" (OuterVolumeSpecName: "inventory") pod "2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6" (UID: "2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.313091 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6" (UID: "2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.387982 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.388020 4837 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.388030 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srznj\" (UniqueName: \"kubernetes.io/projected/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-kube-api-access-srznj\") on node \"crc\" DevicePath \"\"" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.388040 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.817263 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" event={"ID":"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6","Type":"ContainerDied","Data":"011972819d057043c306599fed7fa5342801862959ffcdf2a7d97969e48cbd76"} Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.817312 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="011972819d057043c306599fed7fa5342801862959ffcdf2a7d97969e48cbd76" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.817350 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.901615 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts"] Mar 13 12:14:24 crc kubenswrapper[4837]: E0313 12:14:24.902345 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.902470 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 13 12:14:24 crc kubenswrapper[4837]: E0313 12:14:24.902592 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b41b916d-46ab-43e8-b624-bb1fb6aaf2f8" containerName="oc" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.902691 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b41b916d-46ab-43e8-b624-bb1fb6aaf2f8" containerName="oc" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.903048 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="b41b916d-46ab-43e8-b624-bb1fb6aaf2f8" containerName="oc" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.903169 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.904044 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.907463 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.907514 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.907521 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.907914 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.916304 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts"] Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.998247 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6b22\" (UniqueName: \"kubernetes.io/projected/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-kube-api-access-g6b22\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts\" (UID: \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.998316 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts\" (UID: \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" Mar 13 12:14:24 crc 
kubenswrapper[4837]: I0313 12:14:24.998406 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts\" (UID: \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" Mar 13 12:14:25 crc kubenswrapper[4837]: I0313 12:14:25.100492 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts\" (UID: \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" Mar 13 12:14:25 crc kubenswrapper[4837]: I0313 12:14:25.100722 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts\" (UID: \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" Mar 13 12:14:25 crc kubenswrapper[4837]: I0313 12:14:25.100816 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6b22\" (UniqueName: \"kubernetes.io/projected/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-kube-api-access-g6b22\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts\" (UID: \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" Mar 13 12:14:25 crc kubenswrapper[4837]: I0313 12:14:25.106018 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts\" (UID: \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" Mar 13 12:14:25 crc kubenswrapper[4837]: I0313 12:14:25.106022 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts\" (UID: \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" Mar 13 12:14:25 crc kubenswrapper[4837]: I0313 12:14:25.121180 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6b22\" (UniqueName: \"kubernetes.io/projected/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-kube-api-access-g6b22\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts\" (UID: \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" Mar 13 12:14:25 crc kubenswrapper[4837]: I0313 12:14:25.222434 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" Mar 13 12:14:25 crc kubenswrapper[4837]: I0313 12:14:25.726340 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts"] Mar 13 12:14:25 crc kubenswrapper[4837]: I0313 12:14:25.825445 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" event={"ID":"121f6d1b-1277-4d68-8a48-6c4630dd6fe5","Type":"ContainerStarted","Data":"ef1d62890ff7c257d9b17342f3219ac6a0097e8282e72a43ca3f00b10ba0d794"} Mar 13 12:14:26 crc kubenswrapper[4837]: I0313 12:14:26.836601 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" event={"ID":"121f6d1b-1277-4d68-8a48-6c4630dd6fe5","Type":"ContainerStarted","Data":"846a6e75ac7966b1f1da247e3de2868e0139228fe381ae0693bde11ff4d07f27"} Mar 13 12:14:26 crc kubenswrapper[4837]: I0313 12:14:26.862404 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" podStartSLOduration=2.033479004 podStartE2EDuration="2.862383208s" podCreationTimestamp="2026-03-13 12:14:24 +0000 UTC" firstStartedPulling="2026-03-13 12:14:25.722573792 +0000 UTC m=+1581.360840555" lastFinishedPulling="2026-03-13 12:14:26.551477956 +0000 UTC m=+1582.189744759" observedRunningTime="2026-03-13 12:14:26.853420486 +0000 UTC m=+1582.491687249" watchObservedRunningTime="2026-03-13 12:14:26.862383208 +0000 UTC m=+1582.500649971" Mar 13 12:14:28 crc kubenswrapper[4837]: I0313 12:14:28.094329 4837 scope.go:117] "RemoveContainer" containerID="d2184d47fa1ce72a82da97184468ccee1cece609eb9ab8fb1194680ef9c8ea21" Mar 13 12:14:35 crc kubenswrapper[4837]: I0313 12:14:35.484293 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:14:35 crc kubenswrapper[4837]: I0313 12:14:35.484940 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:14:35 crc kubenswrapper[4837]: I0313 12:14:35.485001 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 12:14:35 crc kubenswrapper[4837]: I0313 12:14:35.485899 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:14:35 crc kubenswrapper[4837]: I0313 12:14:35.485958 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" gracePeriod=600 Mar 13 12:14:35 crc kubenswrapper[4837]: E0313 12:14:35.629686 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:14:35 crc kubenswrapper[4837]: I0313 12:14:35.927384 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" exitCode=0 Mar 13 12:14:35 crc kubenswrapper[4837]: I0313 12:14:35.927438 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a"} Mar 13 12:14:35 crc kubenswrapper[4837]: I0313 12:14:35.927487 4837 scope.go:117] "RemoveContainer" containerID="1c6d9e7e9de5c8ffd75bcf8d5717605d713d8068815596e54e918770c94282bc" Mar 13 12:14:35 crc kubenswrapper[4837]: I0313 12:14:35.928219 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:14:35 crc kubenswrapper[4837]: E0313 12:14:35.928484 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:14:49 crc kubenswrapper[4837]: I0313 12:14:49.048373 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:14:49 crc kubenswrapper[4837]: E0313 12:14:49.049098 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.150780 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8"] Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.152394 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.154555 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.155273 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.177333 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8"] Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.314917 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c6ce131-8677-48bc-8f07-b53837bd751b-config-volume\") pod \"collect-profiles-29556735-548l8\" (UID: \"3c6ce131-8677-48bc-8f07-b53837bd751b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.314990 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dht88\" (UniqueName: \"kubernetes.io/projected/3c6ce131-8677-48bc-8f07-b53837bd751b-kube-api-access-dht88\") pod 
\"collect-profiles-29556735-548l8\" (UID: \"3c6ce131-8677-48bc-8f07-b53837bd751b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.315364 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c6ce131-8677-48bc-8f07-b53837bd751b-secret-volume\") pod \"collect-profiles-29556735-548l8\" (UID: \"3c6ce131-8677-48bc-8f07-b53837bd751b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.416586 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dht88\" (UniqueName: \"kubernetes.io/projected/3c6ce131-8677-48bc-8f07-b53837bd751b-kube-api-access-dht88\") pod \"collect-profiles-29556735-548l8\" (UID: \"3c6ce131-8677-48bc-8f07-b53837bd751b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.416736 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c6ce131-8677-48bc-8f07-b53837bd751b-secret-volume\") pod \"collect-profiles-29556735-548l8\" (UID: \"3c6ce131-8677-48bc-8f07-b53837bd751b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.416834 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c6ce131-8677-48bc-8f07-b53837bd751b-config-volume\") pod \"collect-profiles-29556735-548l8\" (UID: \"3c6ce131-8677-48bc-8f07-b53837bd751b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.417604 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c6ce131-8677-48bc-8f07-b53837bd751b-config-volume\") pod \"collect-profiles-29556735-548l8\" (UID: \"3c6ce131-8677-48bc-8f07-b53837bd751b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.428480 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c6ce131-8677-48bc-8f07-b53837bd751b-secret-volume\") pod \"collect-profiles-29556735-548l8\" (UID: \"3c6ce131-8677-48bc-8f07-b53837bd751b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.434576 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dht88\" (UniqueName: \"kubernetes.io/projected/3c6ce131-8677-48bc-8f07-b53837bd751b-kube-api-access-dht88\") pod \"collect-profiles-29556735-548l8\" (UID: \"3c6ce131-8677-48bc-8f07-b53837bd751b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.473468 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.930497 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8"] Mar 13 12:15:01 crc kubenswrapper[4837]: I0313 12:15:01.050528 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:15:01 crc kubenswrapper[4837]: E0313 12:15:01.050812 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:15:01 crc kubenswrapper[4837]: I0313 12:15:01.160465 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" event={"ID":"3c6ce131-8677-48bc-8f07-b53837bd751b","Type":"ContainerStarted","Data":"d444c3350e9c8e5cb2de80b3a01e0398d12ecfb5f36d63e1963c4019db354d3b"} Mar 13 12:15:01 crc kubenswrapper[4837]: I0313 12:15:01.160514 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" event={"ID":"3c6ce131-8677-48bc-8f07-b53837bd751b","Type":"ContainerStarted","Data":"2919ea9a2d0b93f84edb2e77c66e275804e6976a3a742b763c61cae4d408b047"} Mar 13 12:15:01 crc kubenswrapper[4837]: I0313 12:15:01.187524 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" podStartSLOduration=1.187501544 podStartE2EDuration="1.187501544s" podCreationTimestamp="2026-03-13 12:15:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:15:01.176348452 +0000 UTC m=+1616.814615215" watchObservedRunningTime="2026-03-13 12:15:01.187501544 +0000 UTC m=+1616.825768307" Mar 13 12:15:02 crc kubenswrapper[4837]: I0313 12:15:02.177606 4837 generic.go:334] "Generic (PLEG): container finished" podID="3c6ce131-8677-48bc-8f07-b53837bd751b" containerID="d444c3350e9c8e5cb2de80b3a01e0398d12ecfb5f36d63e1963c4019db354d3b" exitCode=0 Mar 13 12:15:02 crc kubenswrapper[4837]: I0313 12:15:02.177670 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" event={"ID":"3c6ce131-8677-48bc-8f07-b53837bd751b","Type":"ContainerDied","Data":"d444c3350e9c8e5cb2de80b3a01e0398d12ecfb5f36d63e1963c4019db354d3b"} Mar 13 12:15:03 crc kubenswrapper[4837]: I0313 12:15:03.563206 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" Mar 13 12:15:03 crc kubenswrapper[4837]: I0313 12:15:03.675887 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c6ce131-8677-48bc-8f07-b53837bd751b-secret-volume\") pod \"3c6ce131-8677-48bc-8f07-b53837bd751b\" (UID: \"3c6ce131-8677-48bc-8f07-b53837bd751b\") " Mar 13 12:15:03 crc kubenswrapper[4837]: I0313 12:15:03.675976 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dht88\" (UniqueName: \"kubernetes.io/projected/3c6ce131-8677-48bc-8f07-b53837bd751b-kube-api-access-dht88\") pod \"3c6ce131-8677-48bc-8f07-b53837bd751b\" (UID: \"3c6ce131-8677-48bc-8f07-b53837bd751b\") " Mar 13 12:15:03 crc kubenswrapper[4837]: I0313 12:15:03.676216 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/3c6ce131-8677-48bc-8f07-b53837bd751b-config-volume\") pod \"3c6ce131-8677-48bc-8f07-b53837bd751b\" (UID: \"3c6ce131-8677-48bc-8f07-b53837bd751b\") " Mar 13 12:15:03 crc kubenswrapper[4837]: I0313 12:15:03.676613 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c6ce131-8677-48bc-8f07-b53837bd751b-config-volume" (OuterVolumeSpecName: "config-volume") pod "3c6ce131-8677-48bc-8f07-b53837bd751b" (UID: "3c6ce131-8677-48bc-8f07-b53837bd751b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:15:03 crc kubenswrapper[4837]: I0313 12:15:03.682139 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c6ce131-8677-48bc-8f07-b53837bd751b-kube-api-access-dht88" (OuterVolumeSpecName: "kube-api-access-dht88") pod "3c6ce131-8677-48bc-8f07-b53837bd751b" (UID: "3c6ce131-8677-48bc-8f07-b53837bd751b"). InnerVolumeSpecName "kube-api-access-dht88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:15:03 crc kubenswrapper[4837]: I0313 12:15:03.682263 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c6ce131-8677-48bc-8f07-b53837bd751b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3c6ce131-8677-48bc-8f07-b53837bd751b" (UID: "3c6ce131-8677-48bc-8f07-b53837bd751b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:15:03 crc kubenswrapper[4837]: I0313 12:15:03.779088 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c6ce131-8677-48bc-8f07-b53837bd751b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 12:15:03 crc kubenswrapper[4837]: I0313 12:15:03.779398 4837 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c6ce131-8677-48bc-8f07-b53837bd751b-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 12:15:03 crc kubenswrapper[4837]: I0313 12:15:03.779412 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dht88\" (UniqueName: \"kubernetes.io/projected/3c6ce131-8677-48bc-8f07-b53837bd751b-kube-api-access-dht88\") on node \"crc\" DevicePath \"\"" Mar 13 12:15:04 crc kubenswrapper[4837]: I0313 12:15:04.198161 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" event={"ID":"3c6ce131-8677-48bc-8f07-b53837bd751b","Type":"ContainerDied","Data":"2919ea9a2d0b93f84edb2e77c66e275804e6976a3a742b763c61cae4d408b047"} Mar 13 12:15:04 crc kubenswrapper[4837]: I0313 12:15:04.198211 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2919ea9a2d0b93f84edb2e77c66e275804e6976a3a742b763c61cae4d408b047" Mar 13 12:15:04 crc kubenswrapper[4837]: I0313 12:15:04.198265 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" Mar 13 12:15:14 crc kubenswrapper[4837]: I0313 12:15:14.053518 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:15:14 crc kubenswrapper[4837]: E0313 12:15:14.054898 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:15:28 crc kubenswrapper[4837]: I0313 12:15:28.048842 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:15:28 crc kubenswrapper[4837]: E0313 12:15:28.049605 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:15:28 crc kubenswrapper[4837]: I0313 12:15:28.180352 4837 scope.go:117] "RemoveContainer" containerID="2b72c4b74ac632994ae39578139216d840009de89378dfe0823503769ad992b6" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.045130 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k4549"] Mar 13 12:15:38 crc kubenswrapper[4837]: E0313 12:15:38.045853 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c6ce131-8677-48bc-8f07-b53837bd751b" containerName="collect-profiles" 
Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.045871 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c6ce131-8677-48bc-8f07-b53837bd751b" containerName="collect-profiles" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.046128 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c6ce131-8677-48bc-8f07-b53837bd751b" containerName="collect-profiles" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.048015 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.064699 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k4549"] Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.162311 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-utilities\") pod \"community-operators-k4549\" (UID: \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\") " pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.162399 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-catalog-content\") pod \"community-operators-k4549\" (UID: \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\") " pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.162617 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgz7b\" (UniqueName: \"kubernetes.io/projected/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-kube-api-access-cgz7b\") pod \"community-operators-k4549\" (UID: \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\") " 
pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.264625 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-utilities\") pod \"community-operators-k4549\" (UID: \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\") " pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.264968 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-catalog-content\") pod \"community-operators-k4549\" (UID: \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\") " pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.265182 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-utilities\") pod \"community-operators-k4549\" (UID: \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\") " pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.265339 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgz7b\" (UniqueName: \"kubernetes.io/projected/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-kube-api-access-cgz7b\") pod \"community-operators-k4549\" (UID: \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\") " pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.265538 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-catalog-content\") pod \"community-operators-k4549\" (UID: \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\") " 
pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.284387 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgz7b\" (UniqueName: \"kubernetes.io/projected/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-kube-api-access-cgz7b\") pod \"community-operators-k4549\" (UID: \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\") " pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.374498 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.874137 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k4549"] Mar 13 12:15:39 crc kubenswrapper[4837]: I0313 12:15:39.552792 4837 generic.go:334] "Generic (PLEG): container finished" podID="b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" containerID="cfa7b2fa10a0f7aa076cbf31c5b4937fe47ce2198c6b9fe0219231b848a6a60d" exitCode=0 Mar 13 12:15:39 crc kubenswrapper[4837]: I0313 12:15:39.552925 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4549" event={"ID":"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66","Type":"ContainerDied","Data":"cfa7b2fa10a0f7aa076cbf31c5b4937fe47ce2198c6b9fe0219231b848a6a60d"} Mar 13 12:15:39 crc kubenswrapper[4837]: I0313 12:15:39.553133 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4549" event={"ID":"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66","Type":"ContainerStarted","Data":"dcd16009bc1432999865446905a1cb6d82986e53bcf2a6d43b335dbf163a3472"} Mar 13 12:15:40 crc kubenswrapper[4837]: I0313 12:15:40.048145 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:15:40 crc kubenswrapper[4837]: E0313 12:15:40.048445 4837 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:15:40 crc kubenswrapper[4837]: I0313 12:15:40.566976 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4549" event={"ID":"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66","Type":"ContainerStarted","Data":"7a4a3be12aa2fb097f4306fe7869dec5b8e0644f7a601b973aa63aeea41540bf"} Mar 13 12:15:41 crc kubenswrapper[4837]: I0313 12:15:41.581481 4837 generic.go:334] "Generic (PLEG): container finished" podID="b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" containerID="7a4a3be12aa2fb097f4306fe7869dec5b8e0644f7a601b973aa63aeea41540bf" exitCode=0 Mar 13 12:15:41 crc kubenswrapper[4837]: I0313 12:15:41.581626 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4549" event={"ID":"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66","Type":"ContainerDied","Data":"7a4a3be12aa2fb097f4306fe7869dec5b8e0644f7a601b973aa63aeea41540bf"} Mar 13 12:15:42 crc kubenswrapper[4837]: I0313 12:15:42.591991 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4549" event={"ID":"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66","Type":"ContainerStarted","Data":"699d7f4f1dce9e09c45dda674b26e74507cafaedac64cc20ccd0ff7ab71a9827"} Mar 13 12:15:42 crc kubenswrapper[4837]: I0313 12:15:42.612698 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k4549" podStartSLOduration=2.098638901 podStartE2EDuration="4.61267534s" podCreationTimestamp="2026-03-13 12:15:38 +0000 UTC" firstStartedPulling="2026-03-13 12:15:39.554951186 +0000 UTC 
m=+1655.193217949" lastFinishedPulling="2026-03-13 12:15:42.068987625 +0000 UTC m=+1657.707254388" observedRunningTime="2026-03-13 12:15:42.608271281 +0000 UTC m=+1658.246538044" watchObservedRunningTime="2026-03-13 12:15:42.61267534 +0000 UTC m=+1658.250942103" Mar 13 12:15:45 crc kubenswrapper[4837]: I0313 12:15:45.428581 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-66kqz"] Mar 13 12:15:45 crc kubenswrapper[4837]: I0313 12:15:45.432000 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66kqz" Mar 13 12:15:45 crc kubenswrapper[4837]: I0313 12:15:45.450814 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-66kqz"] Mar 13 12:15:45 crc kubenswrapper[4837]: I0313 12:15:45.498925 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39642113-74ee-406e-9ffa-5b1f8a86f0a3-utilities\") pod \"redhat-marketplace-66kqz\" (UID: \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\") " pod="openshift-marketplace/redhat-marketplace-66kqz" Mar 13 12:15:45 crc kubenswrapper[4837]: I0313 12:15:45.499146 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c96br\" (UniqueName: \"kubernetes.io/projected/39642113-74ee-406e-9ffa-5b1f8a86f0a3-kube-api-access-c96br\") pod \"redhat-marketplace-66kqz\" (UID: \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\") " pod="openshift-marketplace/redhat-marketplace-66kqz" Mar 13 12:15:45 crc kubenswrapper[4837]: I0313 12:15:45.500062 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39642113-74ee-406e-9ffa-5b1f8a86f0a3-catalog-content\") pod \"redhat-marketplace-66kqz\" (UID: \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\") " 
pod="openshift-marketplace/redhat-marketplace-66kqz"
Mar 13 12:15:45 crc kubenswrapper[4837]: I0313 12:15:45.602234 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39642113-74ee-406e-9ffa-5b1f8a86f0a3-catalog-content\") pod \"redhat-marketplace-66kqz\" (UID: \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\") " pod="openshift-marketplace/redhat-marketplace-66kqz"
Mar 13 12:15:45 crc kubenswrapper[4837]: I0313 12:15:45.602313 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39642113-74ee-406e-9ffa-5b1f8a86f0a3-utilities\") pod \"redhat-marketplace-66kqz\" (UID: \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\") " pod="openshift-marketplace/redhat-marketplace-66kqz"
Mar 13 12:15:45 crc kubenswrapper[4837]: I0313 12:15:45.602374 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c96br\" (UniqueName: \"kubernetes.io/projected/39642113-74ee-406e-9ffa-5b1f8a86f0a3-kube-api-access-c96br\") pod \"redhat-marketplace-66kqz\" (UID: \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\") " pod="openshift-marketplace/redhat-marketplace-66kqz"
Mar 13 12:15:45 crc kubenswrapper[4837]: I0313 12:15:45.602729 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39642113-74ee-406e-9ffa-5b1f8a86f0a3-catalog-content\") pod \"redhat-marketplace-66kqz\" (UID: \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\") " pod="openshift-marketplace/redhat-marketplace-66kqz"
Mar 13 12:15:45 crc kubenswrapper[4837]: I0313 12:15:45.602776 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39642113-74ee-406e-9ffa-5b1f8a86f0a3-utilities\") pod \"redhat-marketplace-66kqz\" (UID: \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\") " pod="openshift-marketplace/redhat-marketplace-66kqz"
Mar 13 12:15:45 crc kubenswrapper[4837]: I0313 12:15:45.621792 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c96br\" (UniqueName: \"kubernetes.io/projected/39642113-74ee-406e-9ffa-5b1f8a86f0a3-kube-api-access-c96br\") pod \"redhat-marketplace-66kqz\" (UID: \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\") " pod="openshift-marketplace/redhat-marketplace-66kqz"
Mar 13 12:15:45 crc kubenswrapper[4837]: I0313 12:15:45.767196 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66kqz"
Mar 13 12:15:46 crc kubenswrapper[4837]: I0313 12:15:46.053467 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-n42jz"]
Mar 13 12:15:46 crc kubenswrapper[4837]: I0313 12:15:46.076436 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-a48b-account-create-update-ckblt"]
Mar 13 12:15:46 crc kubenswrapper[4837]: I0313 12:15:46.091727 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-n42jz"]
Mar 13 12:15:46 crc kubenswrapper[4837]: I0313 12:15:46.101431 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-a48b-account-create-update-ckblt"]
Mar 13 12:15:46 crc kubenswrapper[4837]: I0313 12:15:46.215363 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-66kqz"]
Mar 13 12:15:46 crc kubenswrapper[4837]: W0313 12:15:46.217219 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39642113_74ee_406e_9ffa_5b1f8a86f0a3.slice/crio-65f583c268758f75b55fc8d8b1b7841f85ff3b2eb56e34fd425055fd65c76ae8 WatchSource:0}: Error finding container 65f583c268758f75b55fc8d8b1b7841f85ff3b2eb56e34fd425055fd65c76ae8: Status 404 returned error can't find the container with id 65f583c268758f75b55fc8d8b1b7841f85ff3b2eb56e34fd425055fd65c76ae8
Mar 13 12:15:46 crc kubenswrapper[4837]: I0313 12:15:46.638155 4837 generic.go:334] "Generic (PLEG): container finished" podID="39642113-74ee-406e-9ffa-5b1f8a86f0a3" containerID="376b0d55207197a286fb352a3a99bb1a2ae96c88906c793388f162a064b37a64" exitCode=0
Mar 13 12:15:46 crc kubenswrapper[4837]: I0313 12:15:46.638206 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66kqz" event={"ID":"39642113-74ee-406e-9ffa-5b1f8a86f0a3","Type":"ContainerDied","Data":"376b0d55207197a286fb352a3a99bb1a2ae96c88906c793388f162a064b37a64"}
Mar 13 12:15:46 crc kubenswrapper[4837]: I0313 12:15:46.638232 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66kqz" event={"ID":"39642113-74ee-406e-9ffa-5b1f8a86f0a3","Type":"ContainerStarted","Data":"65f583c268758f75b55fc8d8b1b7841f85ff3b2eb56e34fd425055fd65c76ae8"}
Mar 13 12:15:47 crc kubenswrapper[4837]: I0313 12:15:47.066447 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28320b08-9dde-491d-b151-21f93395bf10" path="/var/lib/kubelet/pods/28320b08-9dde-491d-b151-21f93395bf10/volumes"
Mar 13 12:15:47 crc kubenswrapper[4837]: I0313 12:15:47.067427 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2936dcb-f1fa-446b-b20f-87e09a9c03ee" path="/var/lib/kubelet/pods/f2936dcb-f1fa-446b-b20f-87e09a9c03ee/volumes"
Mar 13 12:15:48 crc kubenswrapper[4837]: I0313 12:15:48.375848 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k4549"
Mar 13 12:15:48 crc kubenswrapper[4837]: I0313 12:15:48.376357 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k4549"
Mar 13 12:15:48 crc kubenswrapper[4837]: I0313 12:15:48.427924 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k4549"
Mar 13 12:15:48 crc kubenswrapper[4837]: I0313 12:15:48.656314 4837 generic.go:334] "Generic (PLEG): container finished" podID="39642113-74ee-406e-9ffa-5b1f8a86f0a3" containerID="90f4bdf2a9038cc5810f20ee433a0ea49eba585b46346568ad1d765738079425" exitCode=0
Mar 13 12:15:48 crc kubenswrapper[4837]: I0313 12:15:48.656825 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66kqz" event={"ID":"39642113-74ee-406e-9ffa-5b1f8a86f0a3","Type":"ContainerDied","Data":"90f4bdf2a9038cc5810f20ee433a0ea49eba585b46346568ad1d765738079425"}
Mar 13 12:15:48 crc kubenswrapper[4837]: I0313 12:15:48.720450 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k4549"
Mar 13 12:15:49 crc kubenswrapper[4837]: I0313 12:15:49.668051 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66kqz" event={"ID":"39642113-74ee-406e-9ffa-5b1f8a86f0a3","Type":"ContainerStarted","Data":"89780f9b8ffa8f283d677aebde3bc5fc7e658a3ddfba1bebd349e0674ebc91eb"}
Mar 13 12:15:49 crc kubenswrapper[4837]: I0313 12:15:49.686740 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-66kqz" podStartSLOduration=2.160361541 podStartE2EDuration="4.68672236s" podCreationTimestamp="2026-03-13 12:15:45 +0000 UTC" firstStartedPulling="2026-03-13 12:15:46.639932671 +0000 UTC m=+1662.278199434" lastFinishedPulling="2026-03-13 12:15:49.16629349 +0000 UTC m=+1664.804560253" observedRunningTime="2026-03-13 12:15:49.684142579 +0000 UTC m=+1665.322409362" watchObservedRunningTime="2026-03-13 12:15:49.68672236 +0000 UTC m=+1665.324989123"
Mar 13 12:15:50 crc kubenswrapper[4837]: I0313 12:15:50.420549 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k4549"]
Mar 13 12:15:50 crc kubenswrapper[4837]: I0313 12:15:50.675317 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k4549" podUID="b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" containerName="registry-server" containerID="cri-o://699d7f4f1dce9e09c45dda674b26e74507cafaedac64cc20ccd0ff7ab71a9827" gracePeriod=2
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.045080 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d9fb-account-create-update-5jvwd"]
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.080127 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d9fb-account-create-update-5jvwd"]
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.211918 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k4549"
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.349177 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-catalog-content\") pod \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\" (UID: \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\") "
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.349235 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-utilities\") pod \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\" (UID: \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\") "
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.349427 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgz7b\" (UniqueName: \"kubernetes.io/projected/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-kube-api-access-cgz7b\") pod \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\" (UID: \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\") "
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.350078 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-utilities" (OuterVolumeSpecName: "utilities") pod "b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" (UID: "b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.360920 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-kube-api-access-cgz7b" (OuterVolumeSpecName: "kube-api-access-cgz7b") pod "b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" (UID: "b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66"). InnerVolumeSpecName "kube-api-access-cgz7b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.398157 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" (UID: "b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.451809 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgz7b\" (UniqueName: \"kubernetes.io/projected/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-kube-api-access-cgz7b\") on node \"crc\" DevicePath \"\""
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.451852 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.451864 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.684590 4837 generic.go:334] "Generic (PLEG): container finished" podID="b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" containerID="699d7f4f1dce9e09c45dda674b26e74507cafaedac64cc20ccd0ff7ab71a9827" exitCode=0
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.684682 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4549" event={"ID":"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66","Type":"ContainerDied","Data":"699d7f4f1dce9e09c45dda674b26e74507cafaedac64cc20ccd0ff7ab71a9827"}
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.684743 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4549" event={"ID":"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66","Type":"ContainerDied","Data":"dcd16009bc1432999865446905a1cb6d82986e53bcf2a6d43b335dbf163a3472"}
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.684717 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k4549"
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.684765 4837 scope.go:117] "RemoveContainer" containerID="699d7f4f1dce9e09c45dda674b26e74507cafaedac64cc20ccd0ff7ab71a9827"
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.687992 4837 generic.go:334] "Generic (PLEG): container finished" podID="121f6d1b-1277-4d68-8a48-6c4630dd6fe5" containerID="846a6e75ac7966b1f1da247e3de2868e0139228fe381ae0693bde11ff4d07f27" exitCode=0
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.688044 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" event={"ID":"121f6d1b-1277-4d68-8a48-6c4630dd6fe5","Type":"ContainerDied","Data":"846a6e75ac7966b1f1da247e3de2868e0139228fe381ae0693bde11ff4d07f27"}
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.710320 4837 scope.go:117] "RemoveContainer" containerID="7a4a3be12aa2fb097f4306fe7869dec5b8e0644f7a601b973aa63aeea41540bf"
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.748284 4837 scope.go:117] "RemoveContainer" containerID="cfa7b2fa10a0f7aa076cbf31c5b4937fe47ce2198c6b9fe0219231b848a6a60d"
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.764051 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k4549"]
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.775056 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k4549"]
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.794663 4837 scope.go:117] "RemoveContainer" containerID="699d7f4f1dce9e09c45dda674b26e74507cafaedac64cc20ccd0ff7ab71a9827"
Mar 13 12:15:51 crc kubenswrapper[4837]: E0313 12:15:51.795219 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"699d7f4f1dce9e09c45dda674b26e74507cafaedac64cc20ccd0ff7ab71a9827\": container with ID starting with 699d7f4f1dce9e09c45dda674b26e74507cafaedac64cc20ccd0ff7ab71a9827 not found: ID does not exist" containerID="699d7f4f1dce9e09c45dda674b26e74507cafaedac64cc20ccd0ff7ab71a9827"
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.795273 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"699d7f4f1dce9e09c45dda674b26e74507cafaedac64cc20ccd0ff7ab71a9827"} err="failed to get container status \"699d7f4f1dce9e09c45dda674b26e74507cafaedac64cc20ccd0ff7ab71a9827\": rpc error: code = NotFound desc = could not find container \"699d7f4f1dce9e09c45dda674b26e74507cafaedac64cc20ccd0ff7ab71a9827\": container with ID starting with 699d7f4f1dce9e09c45dda674b26e74507cafaedac64cc20ccd0ff7ab71a9827 not found: ID does not exist"
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.795303 4837 scope.go:117] "RemoveContainer" containerID="7a4a3be12aa2fb097f4306fe7869dec5b8e0644f7a601b973aa63aeea41540bf"
Mar 13 12:15:51 crc kubenswrapper[4837]: E0313 12:15:51.795664 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a4a3be12aa2fb097f4306fe7869dec5b8e0644f7a601b973aa63aeea41540bf\": container with ID starting with 7a4a3be12aa2fb097f4306fe7869dec5b8e0644f7a601b973aa63aeea41540bf not found: ID does not exist" containerID="7a4a3be12aa2fb097f4306fe7869dec5b8e0644f7a601b973aa63aeea41540bf"
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.795734 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a4a3be12aa2fb097f4306fe7869dec5b8e0644f7a601b973aa63aeea41540bf"} err="failed to get container status \"7a4a3be12aa2fb097f4306fe7869dec5b8e0644f7a601b973aa63aeea41540bf\": rpc error: code = NotFound desc = could not find container \"7a4a3be12aa2fb097f4306fe7869dec5b8e0644f7a601b973aa63aeea41540bf\": container with ID starting with 7a4a3be12aa2fb097f4306fe7869dec5b8e0644f7a601b973aa63aeea41540bf not found: ID does not exist"
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.795780 4837 scope.go:117] "RemoveContainer" containerID="cfa7b2fa10a0f7aa076cbf31c5b4937fe47ce2198c6b9fe0219231b848a6a60d"
Mar 13 12:15:51 crc kubenswrapper[4837]: E0313 12:15:51.796172 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfa7b2fa10a0f7aa076cbf31c5b4937fe47ce2198c6b9fe0219231b848a6a60d\": container with ID starting with cfa7b2fa10a0f7aa076cbf31c5b4937fe47ce2198c6b9fe0219231b848a6a60d not found: ID does not exist" containerID="cfa7b2fa10a0f7aa076cbf31c5b4937fe47ce2198c6b9fe0219231b848a6a60d"
Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.796218 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfa7b2fa10a0f7aa076cbf31c5b4937fe47ce2198c6b9fe0219231b848a6a60d"} err="failed to get container status \"cfa7b2fa10a0f7aa076cbf31c5b4937fe47ce2198c6b9fe0219231b848a6a60d\": rpc error: code = NotFound desc = could not find container \"cfa7b2fa10a0f7aa076cbf31c5b4937fe47ce2198c6b9fe0219231b848a6a60d\": container with ID starting with cfa7b2fa10a0f7aa076cbf31c5b4937fe47ce2198c6b9fe0219231b848a6a60d not found: ID does not exist"
Mar 13 12:15:52 crc kubenswrapper[4837]: I0313 12:15:52.037466 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-rb248"]
Mar 13 12:15:52 crc kubenswrapper[4837]: I0313 12:15:52.063723 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-d970-account-create-update-lkc7z"]
Mar 13 12:15:52 crc kubenswrapper[4837]: I0313 12:15:52.077429 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-gmczg"]
Mar 13 12:15:52 crc kubenswrapper[4837]: I0313 12:15:52.089180 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-rb248"]
Mar 13 12:15:52 crc kubenswrapper[4837]: I0313 12:15:52.099705 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-d970-account-create-update-lkc7z"]
Mar 13 12:15:52 crc kubenswrapper[4837]: I0313 12:15:52.107798 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-gmczg"]
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.079340 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2230cdcb-087e-4882-8aea-c5d850b711ac" path="/var/lib/kubelet/pods/2230cdcb-087e-4882-8aea-c5d850b711ac/volumes"
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.080740 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e" path="/var/lib/kubelet/pods/5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e/volumes"
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.082212 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="737740b8-437c-4c6a-a16f-ac0afcf40b95" path="/var/lib/kubelet/pods/737740b8-437c-4c6a-a16f-ac0afcf40b95/volumes"
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.083026 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" path="/var/lib/kubelet/pods/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66/volumes"
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.083916 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5601ea4-ee81-4e2a-b370-268652332465" path="/var/lib/kubelet/pods/c5601ea4-ee81-4e2a-b370-268652332465/volumes"
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.227461 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts"
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.386098 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-inventory\") pod \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\" (UID: \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\") "
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.386940 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6b22\" (UniqueName: \"kubernetes.io/projected/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-kube-api-access-g6b22\") pod \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\" (UID: \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\") "
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.387098 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-ssh-key-openstack-edpm-ipam\") pod \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\" (UID: \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\") "
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.392052 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-kube-api-access-g6b22" (OuterVolumeSpecName: "kube-api-access-g6b22") pod "121f6d1b-1277-4d68-8a48-6c4630dd6fe5" (UID: "121f6d1b-1277-4d68-8a48-6c4630dd6fe5"). InnerVolumeSpecName "kube-api-access-g6b22". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.420048 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-inventory" (OuterVolumeSpecName: "inventory") pod "121f6d1b-1277-4d68-8a48-6c4630dd6fe5" (UID: "121f6d1b-1277-4d68-8a48-6c4630dd6fe5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.420442 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "121f6d1b-1277-4d68-8a48-6c4630dd6fe5" (UID: "121f6d1b-1277-4d68-8a48-6c4630dd6fe5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.488982 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6b22\" (UniqueName: \"kubernetes.io/projected/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-kube-api-access-g6b22\") on node \"crc\" DevicePath \"\""
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.489015 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.489026 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-inventory\") on node \"crc\" DevicePath \"\""
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.707237 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" event={"ID":"121f6d1b-1277-4d68-8a48-6c4630dd6fe5","Type":"ContainerDied","Data":"ef1d62890ff7c257d9b17342f3219ac6a0097e8282e72a43ca3f00b10ba0d794"}
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.707284 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef1d62890ff7c257d9b17342f3219ac6a0097e8282e72a43ca3f00b10ba0d794"
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.707289 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts"
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.785037 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk"]
Mar 13 12:15:53 crc kubenswrapper[4837]: E0313 12:15:53.785453 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" containerName="extract-utilities"
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.785475 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" containerName="extract-utilities"
Mar 13 12:15:53 crc kubenswrapper[4837]: E0313 12:15:53.785490 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" containerName="registry-server"
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.785497 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" containerName="registry-server"
Mar 13 12:15:53 crc kubenswrapper[4837]: E0313 12:15:53.785525 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" containerName="extract-content"
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.785536 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" containerName="extract-content"
Mar 13 12:15:53 crc kubenswrapper[4837]: E0313 12:15:53.785548 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="121f6d1b-1277-4d68-8a48-6c4630dd6fe5" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.785555 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="121f6d1b-1277-4d68-8a48-6c4630dd6fe5" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.785777 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="121f6d1b-1277-4d68-8a48-6c4630dd6fe5" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.785790 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" containerName="registry-server"
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.786493 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk"
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.792006 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.792065 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.792145 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz"
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.792378 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.800297 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk"]
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.896219 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s95mk\" (UID: \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk"
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.896283 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s95mk\" (UID: \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk"
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.896311 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwrrn\" (UniqueName: \"kubernetes.io/projected/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-kube-api-access-vwrrn\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s95mk\" (UID: \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk"
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.998247 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s95mk\" (UID: \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk"
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.998339 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s95mk\" (UID: \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk"
Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.998373 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwrrn\" (UniqueName: \"kubernetes.io/projected/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-kube-api-access-vwrrn\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s95mk\" (UID: \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk"
Mar 13 12:15:54 crc kubenswrapper[4837]: I0313 12:15:54.002420 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s95mk\" (UID: \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk"
Mar 13 12:15:54 crc kubenswrapper[4837]: I0313 12:15:54.004223 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s95mk\" (UID: \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk"
Mar 13 12:15:54 crc kubenswrapper[4837]: I0313 12:15:54.014831 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwrrn\" (UniqueName: \"kubernetes.io/projected/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-kube-api-access-vwrrn\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s95mk\" (UID: \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk"
Mar 13 12:15:54 crc kubenswrapper[4837]: I0313 12:15:54.108791 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk"
Mar 13 12:15:54 crc kubenswrapper[4837]: I0313 12:15:54.624111 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk"]
Mar 13 12:15:54 crc kubenswrapper[4837]: I0313 12:15:54.717460 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk" event={"ID":"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4","Type":"ContainerStarted","Data":"0b26b64e22cc52cf65f26e43050b957ac334bc911e43a00d74e320b57110468c"}
Mar 13 12:15:55 crc kubenswrapper[4837]: I0313 12:15:55.056549 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a"
Mar 13 12:15:55 crc kubenswrapper[4837]: E0313 12:15:55.056829 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8"
Mar 13 12:15:55 crc kubenswrapper[4837]: I0313 12:15:55.728784 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk" event={"ID":"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4","Type":"ContainerStarted","Data":"3519a322b7f03b5bd477d8dd194033af493b40ab4c32f95974aae213419d2bf1"}
Mar 13 12:15:55 crc kubenswrapper[4837]: I0313 12:15:55.746181 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk" podStartSLOduration=2.282869183 podStartE2EDuration="2.746164508s" podCreationTimestamp="2026-03-13 12:15:53 +0000 UTC" firstStartedPulling="2026-03-13 12:15:54.628409039 +0000 UTC m=+1670.266675802" lastFinishedPulling="2026-03-13 12:15:55.091704364 +0000 UTC m=+1670.729971127" observedRunningTime="2026-03-13 12:15:55.744597269 +0000 UTC m=+1671.382864042" watchObservedRunningTime="2026-03-13 12:15:55.746164508 +0000 UTC m=+1671.384431271"
Mar 13 12:15:55 crc kubenswrapper[4837]: I0313 12:15:55.769866 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-66kqz"
Mar 13 12:15:55 crc kubenswrapper[4837]: I0313 12:15:55.772052 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-66kqz"
Mar 13 12:15:55 crc kubenswrapper[4837]: I0313 12:15:55.820837 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-66kqz"
Mar 13 12:15:56 crc kubenswrapper[4837]: I0313 12:15:56.789593 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-66kqz"
Mar 13 12:15:56 crc kubenswrapper[4837]: I0313 12:15:56.845342 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-66kqz"]
Mar 13 12:15:58 crc kubenswrapper[4837]: I0313 12:15:58.751360 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-66kqz" podUID="39642113-74ee-406e-9ffa-5b1f8a86f0a3" containerName="registry-server" containerID="cri-o://89780f9b8ffa8f283d677aebde3bc5fc7e658a3ddfba1bebd349e0674ebc91eb" gracePeriod=2
Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.037423 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-zgdc9"]
Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.061890 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-zgdc9"]
Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.217714 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66kqz"
Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.306868 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39642113-74ee-406e-9ffa-5b1f8a86f0a3-catalog-content\") pod \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\" (UID: \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\") "
Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.307353 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c96br\" (UniqueName: \"kubernetes.io/projected/39642113-74ee-406e-9ffa-5b1f8a86f0a3-kube-api-access-c96br\") pod \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\" (UID: \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\") "
Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.307602 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39642113-74ee-406e-9ffa-5b1f8a86f0a3-utilities\") pod \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\" (UID: \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\") "
Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.308557 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39642113-74ee-406e-9ffa-5b1f8a86f0a3-utilities" (OuterVolumeSpecName: "utilities") pod "39642113-74ee-406e-9ffa-5b1f8a86f0a3" (UID: "39642113-74ee-406e-9ffa-5b1f8a86f0a3"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.316392 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39642113-74ee-406e-9ffa-5b1f8a86f0a3-kube-api-access-c96br" (OuterVolumeSpecName: "kube-api-access-c96br") pod "39642113-74ee-406e-9ffa-5b1f8a86f0a3" (UID: "39642113-74ee-406e-9ffa-5b1f8a86f0a3"). InnerVolumeSpecName "kube-api-access-c96br". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.335660 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39642113-74ee-406e-9ffa-5b1f8a86f0a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "39642113-74ee-406e-9ffa-5b1f8a86f0a3" (UID: "39642113-74ee-406e-9ffa-5b1f8a86f0a3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.409853 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39642113-74ee-406e-9ffa-5b1f8a86f0a3-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.409904 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39642113-74ee-406e-9ffa-5b1f8a86f0a3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.409922 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c96br\" (UniqueName: \"kubernetes.io/projected/39642113-74ee-406e-9ffa-5b1f8a86f0a3-kube-api-access-c96br\") on node \"crc\" DevicePath \"\"" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.768449 4837 generic.go:334] "Generic (PLEG): container finished" podID="39642113-74ee-406e-9ffa-5b1f8a86f0a3" 
containerID="89780f9b8ffa8f283d677aebde3bc5fc7e658a3ddfba1bebd349e0674ebc91eb" exitCode=0 Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.768508 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66kqz" event={"ID":"39642113-74ee-406e-9ffa-5b1f8a86f0a3","Type":"ContainerDied","Data":"89780f9b8ffa8f283d677aebde3bc5fc7e658a3ddfba1bebd349e0674ebc91eb"} Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.768542 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66kqz" event={"ID":"39642113-74ee-406e-9ffa-5b1f8a86f0a3","Type":"ContainerDied","Data":"65f583c268758f75b55fc8d8b1b7841f85ff3b2eb56e34fd425055fd65c76ae8"} Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.768580 4837 scope.go:117] "RemoveContainer" containerID="89780f9b8ffa8f283d677aebde3bc5fc7e658a3ddfba1bebd349e0674ebc91eb" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.768847 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66kqz" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.815501 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-66kqz"] Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.817572 4837 scope.go:117] "RemoveContainer" containerID="90f4bdf2a9038cc5810f20ee433a0ea49eba585b46346568ad1d765738079425" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.826743 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-66kqz"] Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.853968 4837 scope.go:117] "RemoveContainer" containerID="376b0d55207197a286fb352a3a99bb1a2ae96c88906c793388f162a064b37a64" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.879552 4837 scope.go:117] "RemoveContainer" containerID="89780f9b8ffa8f283d677aebde3bc5fc7e658a3ddfba1bebd349e0674ebc91eb" Mar 13 12:15:59 crc kubenswrapper[4837]: E0313 12:15:59.880109 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89780f9b8ffa8f283d677aebde3bc5fc7e658a3ddfba1bebd349e0674ebc91eb\": container with ID starting with 89780f9b8ffa8f283d677aebde3bc5fc7e658a3ddfba1bebd349e0674ebc91eb not found: ID does not exist" containerID="89780f9b8ffa8f283d677aebde3bc5fc7e658a3ddfba1bebd349e0674ebc91eb" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.880153 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89780f9b8ffa8f283d677aebde3bc5fc7e658a3ddfba1bebd349e0674ebc91eb"} err="failed to get container status \"89780f9b8ffa8f283d677aebde3bc5fc7e658a3ddfba1bebd349e0674ebc91eb\": rpc error: code = NotFound desc = could not find container \"89780f9b8ffa8f283d677aebde3bc5fc7e658a3ddfba1bebd349e0674ebc91eb\": container with ID starting with 89780f9b8ffa8f283d677aebde3bc5fc7e658a3ddfba1bebd349e0674ebc91eb not found: 
ID does not exist" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.880184 4837 scope.go:117] "RemoveContainer" containerID="90f4bdf2a9038cc5810f20ee433a0ea49eba585b46346568ad1d765738079425" Mar 13 12:15:59 crc kubenswrapper[4837]: E0313 12:15:59.880616 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90f4bdf2a9038cc5810f20ee433a0ea49eba585b46346568ad1d765738079425\": container with ID starting with 90f4bdf2a9038cc5810f20ee433a0ea49eba585b46346568ad1d765738079425 not found: ID does not exist" containerID="90f4bdf2a9038cc5810f20ee433a0ea49eba585b46346568ad1d765738079425" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.880669 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90f4bdf2a9038cc5810f20ee433a0ea49eba585b46346568ad1d765738079425"} err="failed to get container status \"90f4bdf2a9038cc5810f20ee433a0ea49eba585b46346568ad1d765738079425\": rpc error: code = NotFound desc = could not find container \"90f4bdf2a9038cc5810f20ee433a0ea49eba585b46346568ad1d765738079425\": container with ID starting with 90f4bdf2a9038cc5810f20ee433a0ea49eba585b46346568ad1d765738079425 not found: ID does not exist" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.880696 4837 scope.go:117] "RemoveContainer" containerID="376b0d55207197a286fb352a3a99bb1a2ae96c88906c793388f162a064b37a64" Mar 13 12:15:59 crc kubenswrapper[4837]: E0313 12:15:59.881031 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"376b0d55207197a286fb352a3a99bb1a2ae96c88906c793388f162a064b37a64\": container with ID starting with 376b0d55207197a286fb352a3a99bb1a2ae96c88906c793388f162a064b37a64 not found: ID does not exist" containerID="376b0d55207197a286fb352a3a99bb1a2ae96c88906c793388f162a064b37a64" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.881060 4837 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"376b0d55207197a286fb352a3a99bb1a2ae96c88906c793388f162a064b37a64"} err="failed to get container status \"376b0d55207197a286fb352a3a99bb1a2ae96c88906c793388f162a064b37a64\": rpc error: code = NotFound desc = could not find container \"376b0d55207197a286fb352a3a99bb1a2ae96c88906c793388f162a064b37a64\": container with ID starting with 376b0d55207197a286fb352a3a99bb1a2ae96c88906c793388f162a064b37a64 not found: ID does not exist" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.138343 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556736-26kwx"] Mar 13 12:16:00 crc kubenswrapper[4837]: E0313 12:16:00.138816 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39642113-74ee-406e-9ffa-5b1f8a86f0a3" containerName="extract-content" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.138837 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="39642113-74ee-406e-9ffa-5b1f8a86f0a3" containerName="extract-content" Mar 13 12:16:00 crc kubenswrapper[4837]: E0313 12:16:00.138862 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39642113-74ee-406e-9ffa-5b1f8a86f0a3" containerName="extract-utilities" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.138871 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="39642113-74ee-406e-9ffa-5b1f8a86f0a3" containerName="extract-utilities" Mar 13 12:16:00 crc kubenswrapper[4837]: E0313 12:16:00.138883 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39642113-74ee-406e-9ffa-5b1f8a86f0a3" containerName="registry-server" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.138891 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="39642113-74ee-406e-9ffa-5b1f8a86f0a3" containerName="registry-server" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.139066 4837 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="39642113-74ee-406e-9ffa-5b1f8a86f0a3" containerName="registry-server" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.139695 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556736-26kwx" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.141966 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.142026 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.142153 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.148398 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556736-26kwx"] Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.226704 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9gzw\" (UniqueName: \"kubernetes.io/projected/a2c1518c-d031-4597-ab77-8626e068bcda-kube-api-access-n9gzw\") pod \"auto-csr-approver-29556736-26kwx\" (UID: \"a2c1518c-d031-4597-ab77-8626e068bcda\") " pod="openshift-infra/auto-csr-approver-29556736-26kwx" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.329134 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9gzw\" (UniqueName: \"kubernetes.io/projected/a2c1518c-d031-4597-ab77-8626e068bcda-kube-api-access-n9gzw\") pod \"auto-csr-approver-29556736-26kwx\" (UID: \"a2c1518c-d031-4597-ab77-8626e068bcda\") " pod="openshift-infra/auto-csr-approver-29556736-26kwx" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.345499 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9gzw\" 
(UniqueName: \"kubernetes.io/projected/a2c1518c-d031-4597-ab77-8626e068bcda-kube-api-access-n9gzw\") pod \"auto-csr-approver-29556736-26kwx\" (UID: \"a2c1518c-d031-4597-ab77-8626e068bcda\") " pod="openshift-infra/auto-csr-approver-29556736-26kwx" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.458615 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556736-26kwx" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.892177 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556736-26kwx"] Mar 13 12:16:01 crc kubenswrapper[4837]: I0313 12:16:01.062049 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f" path="/var/lib/kubelet/pods/2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f/volumes" Mar 13 12:16:01 crc kubenswrapper[4837]: I0313 12:16:01.062814 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39642113-74ee-406e-9ffa-5b1f8a86f0a3" path="/var/lib/kubelet/pods/39642113-74ee-406e-9ffa-5b1f8a86f0a3/volumes" Mar 13 12:16:01 crc kubenswrapper[4837]: I0313 12:16:01.790399 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556736-26kwx" event={"ID":"a2c1518c-d031-4597-ab77-8626e068bcda","Type":"ContainerStarted","Data":"00518416b47b957ef8c5e7a34d278e0fa923687f5d90d57d3fd691ac9771deeb"} Mar 13 12:16:02 crc kubenswrapper[4837]: I0313 12:16:02.799676 4837 generic.go:334] "Generic (PLEG): container finished" podID="a2c1518c-d031-4597-ab77-8626e068bcda" containerID="eab36df7c6a9acf9dc7560368f9674c4b5510068e382ff493b327a540b10eb38" exitCode=0 Mar 13 12:16:02 crc kubenswrapper[4837]: I0313 12:16:02.799761 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556736-26kwx" 
event={"ID":"a2c1518c-d031-4597-ab77-8626e068bcda","Type":"ContainerDied","Data":"eab36df7c6a9acf9dc7560368f9674c4b5510068e382ff493b327a540b10eb38"} Mar 13 12:16:04 crc kubenswrapper[4837]: I0313 12:16:04.193472 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556736-26kwx" Mar 13 12:16:04 crc kubenswrapper[4837]: I0313 12:16:04.323713 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9gzw\" (UniqueName: \"kubernetes.io/projected/a2c1518c-d031-4597-ab77-8626e068bcda-kube-api-access-n9gzw\") pod \"a2c1518c-d031-4597-ab77-8626e068bcda\" (UID: \"a2c1518c-d031-4597-ab77-8626e068bcda\") " Mar 13 12:16:04 crc kubenswrapper[4837]: I0313 12:16:04.332005 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2c1518c-d031-4597-ab77-8626e068bcda-kube-api-access-n9gzw" (OuterVolumeSpecName: "kube-api-access-n9gzw") pod "a2c1518c-d031-4597-ab77-8626e068bcda" (UID: "a2c1518c-d031-4597-ab77-8626e068bcda"). InnerVolumeSpecName "kube-api-access-n9gzw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:16:04 crc kubenswrapper[4837]: I0313 12:16:04.425425 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9gzw\" (UniqueName: \"kubernetes.io/projected/a2c1518c-d031-4597-ab77-8626e068bcda-kube-api-access-n9gzw\") on node \"crc\" DevicePath \"\"" Mar 13 12:16:04 crc kubenswrapper[4837]: I0313 12:16:04.829491 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556736-26kwx" event={"ID":"a2c1518c-d031-4597-ab77-8626e068bcda","Type":"ContainerDied","Data":"00518416b47b957ef8c5e7a34d278e0fa923687f5d90d57d3fd691ac9771deeb"} Mar 13 12:16:04 crc kubenswrapper[4837]: I0313 12:16:04.829532 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00518416b47b957ef8c5e7a34d278e0fa923687f5d90d57d3fd691ac9771deeb" Mar 13 12:16:04 crc kubenswrapper[4837]: I0313 12:16:04.829568 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556736-26kwx" Mar 13 12:16:05 crc kubenswrapper[4837]: I0313 12:16:05.263777 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556730-jvprz"] Mar 13 12:16:05 crc kubenswrapper[4837]: I0313 12:16:05.279418 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556730-jvprz"] Mar 13 12:16:07 crc kubenswrapper[4837]: I0313 12:16:07.078466 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="348878ea-aa9f-4306-af10-6a56583447a4" path="/var/lib/kubelet/pods/348878ea-aa9f-4306-af10-6a56583447a4/volumes" Mar 13 12:16:09 crc kubenswrapper[4837]: I0313 12:16:09.050992 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:16:09 crc kubenswrapper[4837]: E0313 12:16:09.052173 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:16:18 crc kubenswrapper[4837]: I0313 12:16:18.050344 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-mbps4"] Mar 13 12:16:18 crc kubenswrapper[4837]: I0313 12:16:18.062692 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-mbps4"] Mar 13 12:16:19 crc kubenswrapper[4837]: I0313 12:16:19.059332 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="685f13a4-d293-4199-8049-67b02c0162c1" path="/var/lib/kubelet/pods/685f13a4-d293-4199-8049-67b02c0162c1/volumes" Mar 13 12:16:22 crc kubenswrapper[4837]: I0313 12:16:22.043741 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-45j5g"] Mar 13 12:16:22 crc kubenswrapper[4837]: I0313 12:16:22.054023 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-g24hg"] Mar 13 12:16:22 crc kubenswrapper[4837]: I0313 12:16:22.062010 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6b07-account-create-update-wxqsd"] Mar 13 12:16:22 crc kubenswrapper[4837]: I0313 12:16:22.074277 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-330b-account-create-update-snkff"] Mar 13 12:16:22 crc kubenswrapper[4837]: I0313 12:16:22.084521 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-g24hg"] Mar 13 12:16:22 crc kubenswrapper[4837]: I0313 12:16:22.094710 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-6b07-account-create-update-wxqsd"] Mar 13 12:16:22 crc kubenswrapper[4837]: I0313 12:16:22.104564 
4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-45j5g"] Mar 13 12:16:22 crc kubenswrapper[4837]: I0313 12:16:22.113982 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-9a59-account-create-update-hqxzk"] Mar 13 12:16:22 crc kubenswrapper[4837]: I0313 12:16:22.121878 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-330b-account-create-update-snkff"] Mar 13 12:16:22 crc kubenswrapper[4837]: I0313 12:16:22.131810 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-9a59-account-create-update-hqxzk"] Mar 13 12:16:23 crc kubenswrapper[4837]: I0313 12:16:23.064769 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72768daf-a5fa-4c8e-b9c3-49cd5f87fe30" path="/var/lib/kubelet/pods/72768daf-a5fa-4c8e-b9c3-49cd5f87fe30/volumes" Mar 13 12:16:23 crc kubenswrapper[4837]: I0313 12:16:23.065589 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77cef7b0-af86-456f-973b-923cb901b88d" path="/var/lib/kubelet/pods/77cef7b0-af86-456f-973b-923cb901b88d/volumes" Mar 13 12:16:23 crc kubenswrapper[4837]: I0313 12:16:23.066145 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a78456e1-6f14-45d4-ab3f-1fea88af4749" path="/var/lib/kubelet/pods/a78456e1-6f14-45d4-ab3f-1fea88af4749/volumes" Mar 13 12:16:23 crc kubenswrapper[4837]: I0313 12:16:23.066747 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6b37e8b-50ec-402e-ae31-27ff0d84e0be" path="/var/lib/kubelet/pods/e6b37e8b-50ec-402e-ae31-27ff0d84e0be/volumes" Mar 13 12:16:23 crc kubenswrapper[4837]: I0313 12:16:23.067970 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f029b52a-1a09-44b3-affe-9449cd6a5944" path="/var/lib/kubelet/pods/f029b52a-1a09-44b3-affe-9449cd6a5944/volumes" Mar 13 12:16:24 crc kubenswrapper[4837]: I0313 12:16:24.047938 4837 scope.go:117] "RemoveContainer" 
containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:16:24 crc kubenswrapper[4837]: E0313 12:16:24.048466 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:16:26 crc kubenswrapper[4837]: I0313 12:16:26.035184 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-2dlt8"] Mar 13 12:16:26 crc kubenswrapper[4837]: I0313 12:16:26.043763 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-2dlt8"] Mar 13 12:16:27 crc kubenswrapper[4837]: I0313 12:16:27.064078 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe" path="/var/lib/kubelet/pods/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe/volumes" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.235171 4837 scope.go:117] "RemoveContainer" containerID="258afa8ad3c4b205a4d5ebbc2dad025a8beb1c8bcd26054b8547c8dad13f8f6c" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.256908 4837 scope.go:117] "RemoveContainer" containerID="fbb8d3067503d33b0b6e6a915789395c7b9c10818b3ce84f4506b15f77d6207f" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.300461 4837 scope.go:117] "RemoveContainer" containerID="4ef4f42482f9efbb7e95ba0aa3a8a4567cffbb3946a12623724cae5ed211d4e1" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.340397 4837 scope.go:117] "RemoveContainer" containerID="4f6fb24113c34cb08d7bf34817309c7c27eeab0cdaee4f12683e138394d254b1" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.388049 4837 scope.go:117] "RemoveContainer" 
containerID="40deea41e769b1017207ec620ac05bd1eeae7028c9b2f3cacb4bc02a7f4fffdf" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.438064 4837 scope.go:117] "RemoveContainer" containerID="248bbf02c11ba4d4459897916fec2f24105abad663f25d012f6888d993c3fbac" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.481438 4837 scope.go:117] "RemoveContainer" containerID="400f25fc20473b4a0989af2562c9f1940f8ca26a8e2532da0bcde1d8c359bf39" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.526622 4837 scope.go:117] "RemoveContainer" containerID="4278a43d1836aa1abbebaa7d3b0197dd5fc3373adc2b4d3124d2a223104eef56" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.579324 4837 scope.go:117] "RemoveContainer" containerID="42bfa52d5c8c4ce4aaf6212f222930fd5d442e727a1a7b492df691e11a1e81f6" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.617356 4837 scope.go:117] "RemoveContainer" containerID="8295d45762eef27ce4120c578b478e84691da779f8c9457d397485b5b46c5eba" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.666443 4837 scope.go:117] "RemoveContainer" containerID="e460ab529bcbaef415dda78934a987cdd80d8b23f4cad796d19dcd468ce2d5f7" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.691013 4837 scope.go:117] "RemoveContainer" containerID="d01d04b228faf7f13c332e53f55aacbde9b692f0da2cccf686b1a57f52fa8fe2" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.709261 4837 scope.go:117] "RemoveContainer" containerID="286a6a1365f30df6b40943e24ec3066d64b002e22ec98bea016b42eeee5b1160" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.726182 4837 scope.go:117] "RemoveContainer" containerID="7c2129e0048255a871372a3d7023ed828ca0d6f1f4e610da012f5353ff07c822" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.743359 4837 scope.go:117] "RemoveContainer" containerID="9c444d34c403a2440618afe6e0c75ef9551c465f012f8ba4f50c5bde9744bb16" Mar 13 12:16:39 crc kubenswrapper[4837]: I0313 12:16:39.048305 4837 scope.go:117] "RemoveContainer" 
containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:16:39 crc kubenswrapper[4837]: E0313 12:16:39.049964 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:16:52 crc kubenswrapper[4837]: I0313 12:16:52.046105 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-jkthw"] Mar 13 12:16:52 crc kubenswrapper[4837]: I0313 12:16:52.059455 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-jkthw"] Mar 13 12:16:52 crc kubenswrapper[4837]: I0313 12:16:52.242611 4837 generic.go:334] "Generic (PLEG): container finished" podID="875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4" containerID="3519a322b7f03b5bd477d8dd194033af493b40ab4c32f95974aae213419d2bf1" exitCode=0 Mar 13 12:16:52 crc kubenswrapper[4837]: I0313 12:16:52.242667 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk" event={"ID":"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4","Type":"ContainerDied","Data":"3519a322b7f03b5bd477d8dd194033af493b40ab4c32f95974aae213419d2bf1"} Mar 13 12:16:53 crc kubenswrapper[4837]: I0313 12:16:53.072973 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4490fb3-45d7-4b40-ad34-5bf33ba88491" path="/var/lib/kubelet/pods/b4490fb3-45d7-4b40-ad34-5bf33ba88491/volumes" Mar 13 12:16:53 crc kubenswrapper[4837]: I0313 12:16:53.664765 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk" Mar 13 12:16:53 crc kubenswrapper[4837]: I0313 12:16:53.730357 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwrrn\" (UniqueName: \"kubernetes.io/projected/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-kube-api-access-vwrrn\") pod \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\" (UID: \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\") " Mar 13 12:16:53 crc kubenswrapper[4837]: I0313 12:16:53.730456 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-ssh-key-openstack-edpm-ipam\") pod \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\" (UID: \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\") " Mar 13 12:16:53 crc kubenswrapper[4837]: I0313 12:16:53.730555 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-inventory\") pod \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\" (UID: \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\") " Mar 13 12:16:53 crc kubenswrapper[4837]: I0313 12:16:53.749446 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-kube-api-access-vwrrn" (OuterVolumeSpecName: "kube-api-access-vwrrn") pod "875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4" (UID: "875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4"). InnerVolumeSpecName "kube-api-access-vwrrn". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:16:53 crc kubenswrapper[4837]: I0313 12:16:53.757751 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4" (UID: "875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:16:53 crc kubenswrapper[4837]: I0313 12:16:53.757764 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-inventory" (OuterVolumeSpecName: "inventory") pod "875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4" (UID: "875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:16:53 crc kubenswrapper[4837]: I0313 12:16:53.832885 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwrrn\" (UniqueName: \"kubernetes.io/projected/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-kube-api-access-vwrrn\") on node \"crc\" DevicePath \"\""
Mar 13 12:16:53 crc kubenswrapper[4837]: I0313 12:16:53.832918 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 13 12:16:53 crc kubenswrapper[4837]: I0313 12:16:53.832931 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-inventory\") on node \"crc\" DevicePath \"\""
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.034226 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-wdwg2"]
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.044018 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-wdwg2"]
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.048790 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a"
Mar 13 12:16:54 crc kubenswrapper[4837]: E0313 12:16:54.049056 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8"
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.259494 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk" event={"ID":"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4","Type":"ContainerDied","Data":"0b26b64e22cc52cf65f26e43050b957ac334bc911e43a00d74e320b57110468c"}
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.259533 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b26b64e22cc52cf65f26e43050b957ac334bc911e43a00d74e320b57110468c"
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.259543 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk"
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.338038 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8"]
Mar 13 12:16:54 crc kubenswrapper[4837]: E0313 12:16:54.338513 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.338530 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 13 12:16:54 crc kubenswrapper[4837]: E0313 12:16:54.338554 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2c1518c-d031-4597-ab77-8626e068bcda" containerName="oc"
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.338561 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2c1518c-d031-4597-ab77-8626e068bcda" containerName="oc"
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.338763 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2c1518c-d031-4597-ab77-8626e068bcda" containerName="oc"
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.338793 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.339396 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8"
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.342462 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.342587 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.342749 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz"
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.342749 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.358418 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8"]
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.442479 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh6vg\" (UniqueName: \"kubernetes.io/projected/e3ec33da-9091-4eb1-aafa-62b9bdf16072-kube-api-access-bh6vg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42br8\" (UID: \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8"
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.442811 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3ec33da-9091-4eb1-aafa-62b9bdf16072-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42br8\" (UID: \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8"
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.442866 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3ec33da-9091-4eb1-aafa-62b9bdf16072-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42br8\" (UID: \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8"
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.544621 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh6vg\" (UniqueName: \"kubernetes.io/projected/e3ec33da-9091-4eb1-aafa-62b9bdf16072-kube-api-access-bh6vg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42br8\" (UID: \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8"
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.544798 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3ec33da-9091-4eb1-aafa-62b9bdf16072-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42br8\" (UID: \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8"
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.544870 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3ec33da-9091-4eb1-aafa-62b9bdf16072-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42br8\" (UID: \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8"
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.550158 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3ec33da-9091-4eb1-aafa-62b9bdf16072-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42br8\" (UID: \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8"
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.550400 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3ec33da-9091-4eb1-aafa-62b9bdf16072-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42br8\" (UID: \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8"
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.563742 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh6vg\" (UniqueName: \"kubernetes.io/projected/e3ec33da-9091-4eb1-aafa-62b9bdf16072-kube-api-access-bh6vg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42br8\" (UID: \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8"
Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.664740 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8"
Mar 13 12:16:55 crc kubenswrapper[4837]: I0313 12:16:55.059448 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2d0a770-288f-40d8-832e-f5463863bef1" path="/var/lib/kubelet/pods/d2d0a770-288f-40d8-832e-f5463863bef1/volumes"
Mar 13 12:16:55 crc kubenswrapper[4837]: I0313 12:16:55.191799 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8"]
Mar 13 12:16:55 crc kubenswrapper[4837]: I0313 12:16:55.267277 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8" event={"ID":"e3ec33da-9091-4eb1-aafa-62b9bdf16072","Type":"ContainerStarted","Data":"7e407174b52c46f6e2f99460d322d25ed15b3491c843625b34f53f2a1491b8e6"}
Mar 13 12:16:56 crc kubenswrapper[4837]: I0313 12:16:56.284135 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8" event={"ID":"e3ec33da-9091-4eb1-aafa-62b9bdf16072","Type":"ContainerStarted","Data":"fdf1000310c4e8eefb8a2b2cf15e340f600507dfb833c98fdddfad4aa86a3d48"}
Mar 13 12:17:00 crc kubenswrapper[4837]: I0313 12:17:00.317123 4837 generic.go:334] "Generic (PLEG): container finished" podID="e3ec33da-9091-4eb1-aafa-62b9bdf16072" containerID="fdf1000310c4e8eefb8a2b2cf15e340f600507dfb833c98fdddfad4aa86a3d48" exitCode=0
Mar 13 12:17:00 crc kubenswrapper[4837]: I0313 12:17:00.317589 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8" event={"ID":"e3ec33da-9091-4eb1-aafa-62b9bdf16072","Type":"ContainerDied","Data":"fdf1000310c4e8eefb8a2b2cf15e340f600507dfb833c98fdddfad4aa86a3d48"}
Mar 13 12:17:01 crc kubenswrapper[4837]: I0313 12:17:01.744927 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8"
Mar 13 12:17:01 crc kubenswrapper[4837]: I0313 12:17:01.802817 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh6vg\" (UniqueName: \"kubernetes.io/projected/e3ec33da-9091-4eb1-aafa-62b9bdf16072-kube-api-access-bh6vg\") pod \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\" (UID: \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\") "
Mar 13 12:17:01 crc kubenswrapper[4837]: I0313 12:17:01.803023 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3ec33da-9091-4eb1-aafa-62b9bdf16072-inventory\") pod \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\" (UID: \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\") "
Mar 13 12:17:01 crc kubenswrapper[4837]: I0313 12:17:01.803154 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3ec33da-9091-4eb1-aafa-62b9bdf16072-ssh-key-openstack-edpm-ipam\") pod \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\" (UID: \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\") "
Mar 13 12:17:01 crc kubenswrapper[4837]: I0313 12:17:01.810907 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3ec33da-9091-4eb1-aafa-62b9bdf16072-kube-api-access-bh6vg" (OuterVolumeSpecName: "kube-api-access-bh6vg") pod "e3ec33da-9091-4eb1-aafa-62b9bdf16072" (UID: "e3ec33da-9091-4eb1-aafa-62b9bdf16072"). InnerVolumeSpecName "kube-api-access-bh6vg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:17:01 crc kubenswrapper[4837]: I0313 12:17:01.835624 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ec33da-9091-4eb1-aafa-62b9bdf16072-inventory" (OuterVolumeSpecName: "inventory") pod "e3ec33da-9091-4eb1-aafa-62b9bdf16072" (UID: "e3ec33da-9091-4eb1-aafa-62b9bdf16072"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:17:01 crc kubenswrapper[4837]: I0313 12:17:01.844289 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ec33da-9091-4eb1-aafa-62b9bdf16072-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e3ec33da-9091-4eb1-aafa-62b9bdf16072" (UID: "e3ec33da-9091-4eb1-aafa-62b9bdf16072"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:17:01 crc kubenswrapper[4837]: I0313 12:17:01.905828 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3ec33da-9091-4eb1-aafa-62b9bdf16072-inventory\") on node \"crc\" DevicePath \"\""
Mar 13 12:17:01 crc kubenswrapper[4837]: I0313 12:17:01.905874 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3ec33da-9091-4eb1-aafa-62b9bdf16072-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 13 12:17:01 crc kubenswrapper[4837]: I0313 12:17:01.905889 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bh6vg\" (UniqueName: \"kubernetes.io/projected/e3ec33da-9091-4eb1-aafa-62b9bdf16072-kube-api-access-bh6vg\") on node \"crc\" DevicePath \"\""
Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.337469 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8" event={"ID":"e3ec33da-9091-4eb1-aafa-62b9bdf16072","Type":"ContainerDied","Data":"7e407174b52c46f6e2f99460d322d25ed15b3491c843625b34f53f2a1491b8e6"}
Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.337878 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e407174b52c46f6e2f99460d322d25ed15b3491c843625b34f53f2a1491b8e6"
Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.337528 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8"
Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.404919 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q"]
Mar 13 12:17:02 crc kubenswrapper[4837]: E0313 12:17:02.405377 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ec33da-9091-4eb1-aafa-62b9bdf16072" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.405393 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ec33da-9091-4eb1-aafa-62b9bdf16072" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.405577 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ec33da-9091-4eb1-aafa-62b9bdf16072" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.406217 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q"
Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.408070 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.408233 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.408355 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz"
Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.408536 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.414259 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q"]
Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.517604 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9vmj\" (UniqueName: \"kubernetes.io/projected/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-kube-api-access-w9vmj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2b48q\" (UID: \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q"
Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.517697 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2b48q\" (UID: \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q"
Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.518030 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2b48q\" (UID: \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q"
Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.620255 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9vmj\" (UniqueName: \"kubernetes.io/projected/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-kube-api-access-w9vmj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2b48q\" (UID: \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q"
Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.620322 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2b48q\" (UID: \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q"
Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.620491 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2b48q\" (UID: \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q"
Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.624075 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2b48q\" (UID: \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q"
Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.625193 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2b48q\" (UID: \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q"
Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.644715 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9vmj\" (UniqueName: \"kubernetes.io/projected/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-kube-api-access-w9vmj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2b48q\" (UID: \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q"
Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.723207 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q"
Mar 13 12:17:03 crc kubenswrapper[4837]: I0313 12:17:03.217958 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q"]
Mar 13 12:17:03 crc kubenswrapper[4837]: I0313 12:17:03.348254 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q" event={"ID":"033a02c2-cbe4-4676-ae46-f9b9b17a60fb","Type":"ContainerStarted","Data":"91032604b3d0fae03f8ede54a282bb2194bdb5ffb6e2e1270d8112c4c0b7f064"}
Mar 13 12:17:04 crc kubenswrapper[4837]: I0313 12:17:04.357070 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q" event={"ID":"033a02c2-cbe4-4676-ae46-f9b9b17a60fb","Type":"ContainerStarted","Data":"5fbab873202bfa50ec52c325b0665c702c9bbe3e9c1e6e487145d1a320c5bf54"}
Mar 13 12:17:04 crc kubenswrapper[4837]: I0313 12:17:04.380393 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q" podStartSLOduration=1.939548697 podStartE2EDuration="2.380371752s" podCreationTimestamp="2026-03-13 12:17:02 +0000 UTC" firstStartedPulling="2026-03-13 12:17:03.217141126 +0000 UTC m=+1738.855407889" lastFinishedPulling="2026-03-13 12:17:03.657964171 +0000 UTC m=+1739.296230944" observedRunningTime="2026-03-13 12:17:04.374360713 +0000 UTC m=+1740.012627496" watchObservedRunningTime="2026-03-13 12:17:04.380371752 +0000 UTC m=+1740.018638515"
Mar 13 12:17:05 crc kubenswrapper[4837]: I0313 12:17:05.060715 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a"
Mar 13 12:17:05 crc kubenswrapper[4837]: E0313 12:17:05.060972 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8"
Mar 13 12:17:06 crc kubenswrapper[4837]: I0313 12:17:06.031357 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-s7m97"]
Mar 13 12:17:06 crc kubenswrapper[4837]: I0313 12:17:06.039005 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-s7m97"]
Mar 13 12:17:07 crc kubenswrapper[4837]: I0313 12:17:07.061263 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3af4ac68-a437-4be7-adab-1ef336f0cbda" path="/var/lib/kubelet/pods/3af4ac68-a437-4be7-adab-1ef336f0cbda/volumes"
Mar 13 12:17:16 crc kubenswrapper[4837]: I0313 12:17:16.046699 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-8vx8g"]
Mar 13 12:17:16 crc kubenswrapper[4837]: I0313 12:17:16.056445 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-8vx8g"]
Mar 13 12:17:17 crc kubenswrapper[4837]: I0313 12:17:17.040987 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-b6qnm"]
Mar 13 12:17:17 crc kubenswrapper[4837]: I0313 12:17:17.066415 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08c7b2a5-b0b8-433f-b55d-c64eaeea8b76" path="/var/lib/kubelet/pods/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76/volumes"
Mar 13 12:17:17 crc kubenswrapper[4837]: I0313 12:17:17.067285 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-b6qnm"]
Mar 13 12:17:18 crc kubenswrapper[4837]: I0313 12:17:18.037929 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-qdzjz"]
Mar 13 12:17:18 crc kubenswrapper[4837]: I0313 12:17:18.045934 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-qdzjz"]
Mar 13 12:17:19 crc kubenswrapper[4837]: I0313 12:17:19.048390 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a"
Mar 13 12:17:19 crc kubenswrapper[4837]: E0313 12:17:19.049010 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8"
Mar 13 12:17:19 crc kubenswrapper[4837]: I0313 12:17:19.062376 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95b808e7-674f-4592-af6e-f7c8682f6a17" path="/var/lib/kubelet/pods/95b808e7-674f-4592-af6e-f7c8682f6a17/volumes"
Mar 13 12:17:19 crc kubenswrapper[4837]: I0313 12:17:19.063011 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a44db1d6-6da2-41a5-a37f-ffc602f0d55a" path="/var/lib/kubelet/pods/a44db1d6-6da2-41a5-a37f-ffc602f0d55a/volumes"
Mar 13 12:17:29 crc kubenswrapper[4837]: I0313 12:17:29.031925 4837 scope.go:117] "RemoveContainer" containerID="117a085c3636a60886a4974e5b0fb9b17907bfbb02c0f28e14e88a6a4aada355"
Mar 13 12:17:29 crc kubenswrapper[4837]: I0313 12:17:29.083599 4837 scope.go:117] "RemoveContainer" containerID="167d2264a85f4435f333e5de927afa95b020419521d018ef924666fe1959c6ff"
Mar 13 12:17:29 crc kubenswrapper[4837]: I0313 12:17:29.146596 4837 scope.go:117] "RemoveContainer" containerID="483a91e4e8aeb62a4bc9d00fab2fa3f3452e90337b10ae7eb6d6d40d39b495c8"
Mar 13 12:17:29 crc kubenswrapper[4837]: I0313 12:17:29.183517 4837 scope.go:117] "RemoveContainer" containerID="2d322ad3eeeb347ecc17c10b7e12064f45bbd098c57202ba37c2350f75cdbf0c"
Mar 13 12:17:29 crc kubenswrapper[4837]: I0313 12:17:29.233252 4837 scope.go:117] "RemoveContainer" containerID="843cf40344096a3f0565478be09bc819697f7ebe87515db62c711cd361ef6ce2"
Mar 13 12:17:29 crc kubenswrapper[4837]: I0313 12:17:29.270175 4837 scope.go:117] "RemoveContainer" containerID="ba3dda01a90b7b0d00508491184e90f099c7ae7bc849213376ebbc68b88ffd0f"
Mar 13 12:17:32 crc kubenswrapper[4837]: I0313 12:17:32.048726 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a"
Mar 13 12:17:32 crc kubenswrapper[4837]: E0313 12:17:32.049524 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8"
Mar 13 12:17:36 crc kubenswrapper[4837]: I0313 12:17:36.634219 4837 generic.go:334] "Generic (PLEG): container finished" podID="033a02c2-cbe4-4676-ae46-f9b9b17a60fb" containerID="5fbab873202bfa50ec52c325b0665c702c9bbe3e9c1e6e487145d1a320c5bf54" exitCode=0
Mar 13 12:17:36 crc kubenswrapper[4837]: I0313 12:17:36.634315 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q" event={"ID":"033a02c2-cbe4-4676-ae46-f9b9b17a60fb","Type":"ContainerDied","Data":"5fbab873202bfa50ec52c325b0665c702c9bbe3e9c1e6e487145d1a320c5bf54"}
Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.029930 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q"
Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.178766 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-ssh-key-openstack-edpm-ipam\") pod \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\" (UID: \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\") "
Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.178857 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-inventory\") pod \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\" (UID: \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\") "
Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.179102 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9vmj\" (UniqueName: \"kubernetes.io/projected/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-kube-api-access-w9vmj\") pod \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\" (UID: \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\") "
Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.184125 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-kube-api-access-w9vmj" (OuterVolumeSpecName: "kube-api-access-w9vmj") pod "033a02c2-cbe4-4676-ae46-f9b9b17a60fb" (UID: "033a02c2-cbe4-4676-ae46-f9b9b17a60fb"). InnerVolumeSpecName "kube-api-access-w9vmj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.207765 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-inventory" (OuterVolumeSpecName: "inventory") pod "033a02c2-cbe4-4676-ae46-f9b9b17a60fb" (UID: "033a02c2-cbe4-4676-ae46-f9b9b17a60fb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.213525 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "033a02c2-cbe4-4676-ae46-f9b9b17a60fb" (UID: "033a02c2-cbe4-4676-ae46-f9b9b17a60fb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.281422 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.281459 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-inventory\") on node \"crc\" DevicePath \"\""
Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.281472 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9vmj\" (UniqueName: \"kubernetes.io/projected/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-kube-api-access-w9vmj\") on node \"crc\" DevicePath \"\""
Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.650469 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q" event={"ID":"033a02c2-cbe4-4676-ae46-f9b9b17a60fb","Type":"ContainerDied","Data":"91032604b3d0fae03f8ede54a282bb2194bdb5ffb6e2e1270d8112c4c0b7f064"}
Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.650765 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91032604b3d0fae03f8ede54a282bb2194bdb5ffb6e2e1270d8112c4c0b7f064"
Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.650519 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q"
Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.736423 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp"]
Mar 13 12:17:38 crc kubenswrapper[4837]: E0313 12:17:38.737398 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="033a02c2-cbe4-4676-ae46-f9b9b17a60fb" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.737423 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="033a02c2-cbe4-4676-ae46-f9b9b17a60fb" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.737630 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="033a02c2-cbe4-4676-ae46-f9b9b17a60fb" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.738442 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp"
Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.740757 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.740954 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.742917 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.743880 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz"
Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.752974 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp"]
Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.893806 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp\" (UID: \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp"
Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.893884 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qps56\" (UniqueName: \"kubernetes.io/projected/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-kube-api-access-qps56\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp\" (UID: \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp"
Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.893935 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp\" (UID: \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.996297 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp\" (UID: \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.996377 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qps56\" (UniqueName: \"kubernetes.io/projected/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-kube-api-access-qps56\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp\" (UID: \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.996432 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp\" (UID: \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" Mar 13 12:17:39 crc kubenswrapper[4837]: I0313 12:17:39.001385 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-inventory\") 
pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp\" (UID: \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" Mar 13 12:17:39 crc kubenswrapper[4837]: I0313 12:17:39.005285 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp\" (UID: \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" Mar 13 12:17:39 crc kubenswrapper[4837]: I0313 12:17:39.013807 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qps56\" (UniqueName: \"kubernetes.io/projected/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-kube-api-access-qps56\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp\" (UID: \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" Mar 13 12:17:39 crc kubenswrapper[4837]: I0313 12:17:39.068903 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" Mar 13 12:17:39 crc kubenswrapper[4837]: I0313 12:17:39.560479 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp"] Mar 13 12:17:39 crc kubenswrapper[4837]: I0313 12:17:39.660010 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" event={"ID":"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5","Type":"ContainerStarted","Data":"56bfbf56a4b06e90c9acfeb1c1dfdd56d79a35d972d01a9b46ddb65137df22e0"} Mar 13 12:17:40 crc kubenswrapper[4837]: I0313 12:17:40.668552 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" event={"ID":"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5","Type":"ContainerStarted","Data":"7e0a021bc48a043f4211761a5ed2921942e6c5d4b5e12cae51fb489e9f145645"} Mar 13 12:17:40 crc kubenswrapper[4837]: I0313 12:17:40.707964 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" podStartSLOduration=2.230347053 podStartE2EDuration="2.707937113s" podCreationTimestamp="2026-03-13 12:17:38 +0000 UTC" firstStartedPulling="2026-03-13 12:17:39.567406978 +0000 UTC m=+1775.205673741" lastFinishedPulling="2026-03-13 12:17:40.044997038 +0000 UTC m=+1775.683263801" observedRunningTime="2026-03-13 12:17:40.686774689 +0000 UTC m=+1776.325041452" watchObservedRunningTime="2026-03-13 12:17:40.707937113 +0000 UTC m=+1776.346203876" Mar 13 12:17:44 crc kubenswrapper[4837]: I0313 12:17:44.048610 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:17:44 crc kubenswrapper[4837]: E0313 12:17:44.049228 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:17:55 crc kubenswrapper[4837]: I0313 12:17:55.056219 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:17:55 crc kubenswrapper[4837]: E0313 12:17:55.057071 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:18:00 crc kubenswrapper[4837]: I0313 12:18:00.153388 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556738-trdfn"] Mar 13 12:18:00 crc kubenswrapper[4837]: I0313 12:18:00.155159 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556738-trdfn" Mar 13 12:18:00 crc kubenswrapper[4837]: I0313 12:18:00.157317 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:18:00 crc kubenswrapper[4837]: I0313 12:18:00.157401 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:18:00 crc kubenswrapper[4837]: I0313 12:18:00.158010 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:18:00 crc kubenswrapper[4837]: I0313 12:18:00.163367 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556738-trdfn"] Mar 13 12:18:00 crc kubenswrapper[4837]: I0313 12:18:00.308943 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78p67\" (UniqueName: \"kubernetes.io/projected/abf39778-b981-4807-916d-f62ff0a03ac9-kube-api-access-78p67\") pod \"auto-csr-approver-29556738-trdfn\" (UID: \"abf39778-b981-4807-916d-f62ff0a03ac9\") " pod="openshift-infra/auto-csr-approver-29556738-trdfn" Mar 13 12:18:00 crc kubenswrapper[4837]: I0313 12:18:00.411127 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78p67\" (UniqueName: \"kubernetes.io/projected/abf39778-b981-4807-916d-f62ff0a03ac9-kube-api-access-78p67\") pod \"auto-csr-approver-29556738-trdfn\" (UID: \"abf39778-b981-4807-916d-f62ff0a03ac9\") " pod="openshift-infra/auto-csr-approver-29556738-trdfn" Mar 13 12:18:00 crc kubenswrapper[4837]: I0313 12:18:00.432425 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78p67\" (UniqueName: \"kubernetes.io/projected/abf39778-b981-4807-916d-f62ff0a03ac9-kube-api-access-78p67\") pod \"auto-csr-approver-29556738-trdfn\" (UID: \"abf39778-b981-4807-916d-f62ff0a03ac9\") " 
pod="openshift-infra/auto-csr-approver-29556738-trdfn" Mar 13 12:18:00 crc kubenswrapper[4837]: I0313 12:18:00.482385 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556738-trdfn" Mar 13 12:18:00 crc kubenswrapper[4837]: I0313 12:18:00.916103 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556738-trdfn"] Mar 13 12:18:01 crc kubenswrapper[4837]: I0313 12:18:01.040819 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-78jtc"] Mar 13 12:18:01 crc kubenswrapper[4837]: I0313 12:18:01.058841 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-t8qk9"] Mar 13 12:18:01 crc kubenswrapper[4837]: I0313 12:18:01.058881 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-78jtc"] Mar 13 12:18:01 crc kubenswrapper[4837]: I0313 12:18:01.066000 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-mqgjq"] Mar 13 12:18:01 crc kubenswrapper[4837]: I0313 12:18:01.074093 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-t8qk9"] Mar 13 12:18:01 crc kubenswrapper[4837]: I0313 12:18:01.082338 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-mqgjq"] Mar 13 12:18:01 crc kubenswrapper[4837]: I0313 12:18:01.846449 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556738-trdfn" event={"ID":"abf39778-b981-4807-916d-f62ff0a03ac9","Type":"ContainerStarted","Data":"8892a5742b94a7e9f8652d254b030f77a3f8d616c87aead90b427a6d5a291a73"} Mar 13 12:18:02 crc kubenswrapper[4837]: I0313 12:18:02.051143 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-c124-account-create-update-8zqgg"] Mar 13 12:18:02 crc kubenswrapper[4837]: I0313 12:18:02.052955 4837 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-cell1-c124-account-create-update-8zqgg"] Mar 13 12:18:02 crc kubenswrapper[4837]: I0313 12:18:02.856971 4837 generic.go:334] "Generic (PLEG): container finished" podID="abf39778-b981-4807-916d-f62ff0a03ac9" containerID="7e866ef5a9a2608fd8aa30e6d573f07172996e7b068a978cf3d3449b179bd748" exitCode=0 Mar 13 12:18:02 crc kubenswrapper[4837]: I0313 12:18:02.857093 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556738-trdfn" event={"ID":"abf39778-b981-4807-916d-f62ff0a03ac9","Type":"ContainerDied","Data":"7e866ef5a9a2608fd8aa30e6d573f07172996e7b068a978cf3d3449b179bd748"} Mar 13 12:18:03 crc kubenswrapper[4837]: I0313 12:18:03.030600 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-8886-account-create-update-ljcrw"] Mar 13 12:18:03 crc kubenswrapper[4837]: I0313 12:18:03.040902 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-8886-account-create-update-ljcrw"] Mar 13 12:18:03 crc kubenswrapper[4837]: I0313 12:18:03.061478 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ac843c1-9934-4711-aae6-7f6920596cb3" path="/var/lib/kubelet/pods/6ac843c1-9934-4711-aae6-7f6920596cb3/volumes" Mar 13 12:18:03 crc kubenswrapper[4837]: I0313 12:18:03.062296 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb" path="/var/lib/kubelet/pods/8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb/volumes" Mar 13 12:18:03 crc kubenswrapper[4837]: I0313 12:18:03.062986 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e397db42-b505-4447-87a2-4c12ed412f28" path="/var/lib/kubelet/pods/e397db42-b505-4447-87a2-4c12ed412f28/volumes" Mar 13 12:18:03 crc kubenswrapper[4837]: I0313 12:18:03.063653 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e51457d7-9619-4179-8f01-de6ffe5ceb82" 
path="/var/lib/kubelet/pods/e51457d7-9619-4179-8f01-de6ffe5ceb82/volumes" Mar 13 12:18:03 crc kubenswrapper[4837]: I0313 12:18:03.065143 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff8550d6-aacb-4848-928d-b1581a66d499" path="/var/lib/kubelet/pods/ff8550d6-aacb-4848-928d-b1581a66d499/volumes" Mar 13 12:18:03 crc kubenswrapper[4837]: I0313 12:18:03.065714 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-4581-account-create-update-w6tc2"] Mar 13 12:18:03 crc kubenswrapper[4837]: I0313 12:18:03.065747 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-4581-account-create-update-w6tc2"] Mar 13 12:18:04 crc kubenswrapper[4837]: I0313 12:18:04.206010 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556738-trdfn" Mar 13 12:18:04 crc kubenswrapper[4837]: I0313 12:18:04.292282 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78p67\" (UniqueName: \"kubernetes.io/projected/abf39778-b981-4807-916d-f62ff0a03ac9-kube-api-access-78p67\") pod \"abf39778-b981-4807-916d-f62ff0a03ac9\" (UID: \"abf39778-b981-4807-916d-f62ff0a03ac9\") " Mar 13 12:18:04 crc kubenswrapper[4837]: I0313 12:18:04.297841 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abf39778-b981-4807-916d-f62ff0a03ac9-kube-api-access-78p67" (OuterVolumeSpecName: "kube-api-access-78p67") pod "abf39778-b981-4807-916d-f62ff0a03ac9" (UID: "abf39778-b981-4807-916d-f62ff0a03ac9"). InnerVolumeSpecName "kube-api-access-78p67". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:18:04 crc kubenswrapper[4837]: I0313 12:18:04.394130 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78p67\" (UniqueName: \"kubernetes.io/projected/abf39778-b981-4807-916d-f62ff0a03ac9-kube-api-access-78p67\") on node \"crc\" DevicePath \"\"" Mar 13 12:18:04 crc kubenswrapper[4837]: I0313 12:18:04.876722 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556738-trdfn" event={"ID":"abf39778-b981-4807-916d-f62ff0a03ac9","Type":"ContainerDied","Data":"8892a5742b94a7e9f8652d254b030f77a3f8d616c87aead90b427a6d5a291a73"} Mar 13 12:18:04 crc kubenswrapper[4837]: I0313 12:18:04.876796 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8892a5742b94a7e9f8652d254b030f77a3f8d616c87aead90b427a6d5a291a73" Mar 13 12:18:04 crc kubenswrapper[4837]: I0313 12:18:04.876828 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556738-trdfn" Mar 13 12:18:05 crc kubenswrapper[4837]: I0313 12:18:05.060469 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec46ef58-a8e9-4354-b9a1-568535879964" path="/var/lib/kubelet/pods/ec46ef58-a8e9-4354-b9a1-568535879964/volumes" Mar 13 12:18:05 crc kubenswrapper[4837]: I0313 12:18:05.275589 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556732-84qfh"] Mar 13 12:18:05 crc kubenswrapper[4837]: I0313 12:18:05.297348 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556732-84qfh"] Mar 13 12:18:06 crc kubenswrapper[4837]: I0313 12:18:06.047739 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:18:06 crc kubenswrapper[4837]: E0313 12:18:06.048075 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:18:07 crc kubenswrapper[4837]: I0313 12:18:07.060791 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="564139cd-c95b-45c7-bf55-00c944313930" path="/var/lib/kubelet/pods/564139cd-c95b-45c7-bf55-00c944313930/volumes" Mar 13 12:18:21 crc kubenswrapper[4837]: I0313 12:18:21.051671 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:18:21 crc kubenswrapper[4837]: E0313 12:18:21.052394 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:18:22 crc kubenswrapper[4837]: I0313 12:18:22.038343 4837 generic.go:334] "Generic (PLEG): container finished" podID="0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5" containerID="7e0a021bc48a043f4211761a5ed2921942e6c5d4b5e12cae51fb489e9f145645" exitCode=0 Mar 13 12:18:22 crc kubenswrapper[4837]: I0313 12:18:22.038453 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" event={"ID":"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5","Type":"ContainerDied","Data":"7e0a021bc48a043f4211761a5ed2921942e6c5d4b5e12cae51fb489e9f145645"} Mar 13 12:18:23 crc kubenswrapper[4837]: I0313 12:18:23.468479 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" Mar 13 12:18:23 crc kubenswrapper[4837]: I0313 12:18:23.553664 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-ssh-key-openstack-edpm-ipam\") pod \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\" (UID: \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\") " Mar 13 12:18:23 crc kubenswrapper[4837]: I0313 12:18:23.553767 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-inventory\") pod \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\" (UID: \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\") " Mar 13 12:18:23 crc kubenswrapper[4837]: I0313 12:18:23.553823 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qps56\" (UniqueName: \"kubernetes.io/projected/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-kube-api-access-qps56\") pod \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\" (UID: \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\") " Mar 13 12:18:23 crc kubenswrapper[4837]: I0313 12:18:23.564880 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-kube-api-access-qps56" (OuterVolumeSpecName: "kube-api-access-qps56") pod "0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5" (UID: "0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5"). InnerVolumeSpecName "kube-api-access-qps56". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:18:23 crc kubenswrapper[4837]: E0313 12:18:23.581132 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-ssh-key-openstack-edpm-ipam podName:0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5 nodeName:}" failed. 
No retries permitted until 2026-03-13 12:18:24.081097516 +0000 UTC m=+1819.719364279 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key-openstack-edpm-ipam" (UniqueName: "kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-ssh-key-openstack-edpm-ipam") pod "0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5" (UID: "0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5") : error deleting /var/lib/kubelet/pods/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5/volume-subpaths: remove /var/lib/kubelet/pods/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5/volume-subpaths: no such file or directory Mar 13 12:18:23 crc kubenswrapper[4837]: I0313 12:18:23.583919 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-inventory" (OuterVolumeSpecName: "inventory") pod "0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5" (UID: "0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:18:23 crc kubenswrapper[4837]: I0313 12:18:23.656075 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 12:18:23 crc kubenswrapper[4837]: I0313 12:18:23.656129 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qps56\" (UniqueName: \"kubernetes.io/projected/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-kube-api-access-qps56\") on node \"crc\" DevicePath \"\"" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.060559 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" event={"ID":"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5","Type":"ContainerDied","Data":"56bfbf56a4b06e90c9acfeb1c1dfdd56d79a35d972d01a9b46ddb65137df22e0"} Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.060605 4837 pod_container_deletor.go:80] "Container not found 
in pod's containers" containerID="56bfbf56a4b06e90c9acfeb1c1dfdd56d79a35d972d01a9b46ddb65137df22e0" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.060625 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.143010 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-vjnpx"] Mar 13 12:18:24 crc kubenswrapper[4837]: E0313 12:18:24.143773 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.143792 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 13 12:18:24 crc kubenswrapper[4837]: E0313 12:18:24.143829 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abf39778-b981-4807-916d-f62ff0a03ac9" containerName="oc" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.143836 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="abf39778-b981-4807-916d-f62ff0a03ac9" containerName="oc" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.144056 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="abf39778-b981-4807-916d-f62ff0a03ac9" containerName="oc" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.144083 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.145073 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.153911 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-vjnpx"] Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.166604 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-ssh-key-openstack-edpm-ipam\") pod \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\" (UID: \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\") " Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.171452 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5" (UID: "0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.269279 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4ddcb794-ab03-4308-a93c-c5929ed96e01-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-vjnpx\" (UID: \"4ddcb794-ab03-4308-a93c-c5929ed96e01\") " pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.269327 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ddcb794-ab03-4308-a93c-c5929ed96e01-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-vjnpx\" (UID: \"4ddcb794-ab03-4308-a93c-c5929ed96e01\") " pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.269410 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt4ls\" (UniqueName: \"kubernetes.io/projected/4ddcb794-ab03-4308-a93c-c5929ed96e01-kube-api-access-bt4ls\") pod \"ssh-known-hosts-edpm-deployment-vjnpx\" (UID: \"4ddcb794-ab03-4308-a93c-c5929ed96e01\") " pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.269489 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.371976 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4ddcb794-ab03-4308-a93c-c5929ed96e01-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-vjnpx\" (UID: \"4ddcb794-ab03-4308-a93c-c5929ed96e01\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.372059 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ddcb794-ab03-4308-a93c-c5929ed96e01-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-vjnpx\" (UID: \"4ddcb794-ab03-4308-a93c-c5929ed96e01\") " pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.372325 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt4ls\" (UniqueName: \"kubernetes.io/projected/4ddcb794-ab03-4308-a93c-c5929ed96e01-kube-api-access-bt4ls\") pod \"ssh-known-hosts-edpm-deployment-vjnpx\" (UID: \"4ddcb794-ab03-4308-a93c-c5929ed96e01\") " pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.376182 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4ddcb794-ab03-4308-a93c-c5929ed96e01-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-vjnpx\" (UID: \"4ddcb794-ab03-4308-a93c-c5929ed96e01\") " pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.377243 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ddcb794-ab03-4308-a93c-c5929ed96e01-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-vjnpx\" (UID: \"4ddcb794-ab03-4308-a93c-c5929ed96e01\") " pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.391429 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt4ls\" (UniqueName: \"kubernetes.io/projected/4ddcb794-ab03-4308-a93c-c5929ed96e01-kube-api-access-bt4ls\") pod 
\"ssh-known-hosts-edpm-deployment-vjnpx\" (UID: \"4ddcb794-ab03-4308-a93c-c5929ed96e01\") " pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.474825 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.981432 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-vjnpx"] Mar 13 12:18:25 crc kubenswrapper[4837]: I0313 12:18:25.072316 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" event={"ID":"4ddcb794-ab03-4308-a93c-c5929ed96e01","Type":"ContainerStarted","Data":"451be746bf2063e51cbed79829c6492e0e00242a709c0fe8864eca7fe5d169bc"} Mar 13 12:18:26 crc kubenswrapper[4837]: I0313 12:18:26.080421 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" event={"ID":"4ddcb794-ab03-4308-a93c-c5929ed96e01","Type":"ContainerStarted","Data":"f83f1fa87b51b5557d732f083ed0b520911c865e056a38c97d7c668251609759"} Mar 13 12:18:26 crc kubenswrapper[4837]: I0313 12:18:26.101202 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" podStartSLOduration=1.70186305 podStartE2EDuration="2.101184749s" podCreationTimestamp="2026-03-13 12:18:24 +0000 UTC" firstStartedPulling="2026-03-13 12:18:24.993625011 +0000 UTC m=+1820.631891774" lastFinishedPulling="2026-03-13 12:18:25.39294671 +0000 UTC m=+1821.031213473" observedRunningTime="2026-03-13 12:18:26.09454251 +0000 UTC m=+1821.732809273" watchObservedRunningTime="2026-03-13 12:18:26.101184749 +0000 UTC m=+1821.739451512" Mar 13 12:18:29 crc kubenswrapper[4837]: I0313 12:18:29.391881 4837 scope.go:117] "RemoveContainer" containerID="e317d41369cc2f3ddf2e1c831d3041b43d32e03d72c05e27b06993576c33a0e8" Mar 13 12:18:29 crc 
kubenswrapper[4837]: I0313 12:18:29.460917 4837 scope.go:117] "RemoveContainer" containerID="c2cc081c6cf65b0ab460d8cc6143c9f0d5447d7db94e85de44cfe2121792b6a0"
Mar 13 12:18:29 crc kubenswrapper[4837]: I0313 12:18:29.481874 4837 scope.go:117] "RemoveContainer" containerID="1c35974102ee9d500e8bf603751d70cec13d07eed47dcd00ec2798fe9d358807"
Mar 13 12:18:29 crc kubenswrapper[4837]: I0313 12:18:29.534298 4837 scope.go:117] "RemoveContainer" containerID="5e6f4da7142b59c465f13069e8abffd32ebc3f04eeb6b88f772977ed584113c2"
Mar 13 12:18:29 crc kubenswrapper[4837]: I0313 12:18:29.565947 4837 scope.go:117] "RemoveContainer" containerID="5d76ffad79a0d1339467174946f42bf027114aea75c47bb037057ca882b93f88"
Mar 13 12:18:29 crc kubenswrapper[4837]: I0313 12:18:29.610571 4837 scope.go:117] "RemoveContainer" containerID="76d8bcdb73b13d595e4c37de91e0da9193b0dfe32e04f54fbcbfc723d4f95d1f"
Mar 13 12:18:29 crc kubenswrapper[4837]: I0313 12:18:29.653176 4837 scope.go:117] "RemoveContainer" containerID="a98015db97ff0f5b37e30b833d1fc53c9a24f182fbe7bafcf011e2544e8dd80d"
Mar 13 12:18:31 crc kubenswrapper[4837]: I0313 12:18:31.062138 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-f6gwd"]
Mar 13 12:18:31 crc kubenswrapper[4837]: I0313 12:18:31.062431 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-f6gwd"]
Mar 13 12:18:32 crc kubenswrapper[4837]: I0313 12:18:32.048716 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a"
Mar 13 12:18:32 crc kubenswrapper[4837]: E0313 12:18:32.049050 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8"
Mar 13 12:18:32 crc kubenswrapper[4837]: I0313 12:18:32.131089 4837 generic.go:334] "Generic (PLEG): container finished" podID="4ddcb794-ab03-4308-a93c-c5929ed96e01" containerID="f83f1fa87b51b5557d732f083ed0b520911c865e056a38c97d7c668251609759" exitCode=0
Mar 13 12:18:32 crc kubenswrapper[4837]: I0313 12:18:32.131140 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" event={"ID":"4ddcb794-ab03-4308-a93c-c5929ed96e01","Type":"ContainerDied","Data":"f83f1fa87b51b5557d732f083ed0b520911c865e056a38c97d7c668251609759"}
Mar 13 12:18:33 crc kubenswrapper[4837]: I0313 12:18:33.062843 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d6d5bbe-7e5b-4645-95c4-af868cba3244" path="/var/lib/kubelet/pods/5d6d5bbe-7e5b-4645-95c4-af868cba3244/volumes"
Mar 13 12:18:33 crc kubenswrapper[4837]: I0313 12:18:33.611397 4837 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx"
Mar 13 12:18:33 crc kubenswrapper[4837]: I0313 12:18:33.640956 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ddcb794-ab03-4308-a93c-c5929ed96e01-ssh-key-openstack-edpm-ipam\") pod \"4ddcb794-ab03-4308-a93c-c5929ed96e01\" (UID: \"4ddcb794-ab03-4308-a93c-c5929ed96e01\") "
Mar 13 12:18:33 crc kubenswrapper[4837]: I0313 12:18:33.641069 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4ddcb794-ab03-4308-a93c-c5929ed96e01-inventory-0\") pod \"4ddcb794-ab03-4308-a93c-c5929ed96e01\" (UID: \"4ddcb794-ab03-4308-a93c-c5929ed96e01\") "
Mar 13 12:18:33 crc kubenswrapper[4837]: I0313 12:18:33.641129 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt4ls\" (UniqueName: \"kubernetes.io/projected/4ddcb794-ab03-4308-a93c-c5929ed96e01-kube-api-access-bt4ls\") pod \"4ddcb794-ab03-4308-a93c-c5929ed96e01\" (UID: \"4ddcb794-ab03-4308-a93c-c5929ed96e01\") "
Mar 13 12:18:33 crc kubenswrapper[4837]: I0313 12:18:33.648852 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ddcb794-ab03-4308-a93c-c5929ed96e01-kube-api-access-bt4ls" (OuterVolumeSpecName: "kube-api-access-bt4ls") pod "4ddcb794-ab03-4308-a93c-c5929ed96e01" (UID: "4ddcb794-ab03-4308-a93c-c5929ed96e01"). InnerVolumeSpecName "kube-api-access-bt4ls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:18:33 crc kubenswrapper[4837]: I0313 12:18:33.668981 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ddcb794-ab03-4308-a93c-c5929ed96e01-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4ddcb794-ab03-4308-a93c-c5929ed96e01" (UID: "4ddcb794-ab03-4308-a93c-c5929ed96e01"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:18:33 crc kubenswrapper[4837]: I0313 12:18:33.672810 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ddcb794-ab03-4308-a93c-c5929ed96e01-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "4ddcb794-ab03-4308-a93c-c5929ed96e01" (UID: "4ddcb794-ab03-4308-a93c-c5929ed96e01"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:18:33 crc kubenswrapper[4837]: I0313 12:18:33.743507 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt4ls\" (UniqueName: \"kubernetes.io/projected/4ddcb794-ab03-4308-a93c-c5929ed96e01-kube-api-access-bt4ls\") on node \"crc\" DevicePath \"\""
Mar 13 12:18:33 crc kubenswrapper[4837]: I0313 12:18:33.743543 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ddcb794-ab03-4308-a93c-c5929ed96e01-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 13 12:18:33 crc kubenswrapper[4837]: I0313 12:18:33.743553 4837 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4ddcb794-ab03-4308-a93c-c5929ed96e01-inventory-0\") on node \"crc\" DevicePath \"\""
Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.146967 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx"
event={"ID":"4ddcb794-ab03-4308-a93c-c5929ed96e01","Type":"ContainerDied","Data":"451be746bf2063e51cbed79829c6492e0e00242a709c0fe8864eca7fe5d169bc"}
Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.147017 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="451be746bf2063e51cbed79829c6492e0e00242a709c0fe8864eca7fe5d169bc"
Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.147409 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx"
Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.264191 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp"]
Mar 13 12:18:34 crc kubenswrapper[4837]: E0313 12:18:34.264742 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ddcb794-ab03-4308-a93c-c5929ed96e01" containerName="ssh-known-hosts-edpm-deployment"
Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.264760 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ddcb794-ab03-4308-a93c-c5929ed96e01" containerName="ssh-known-hosts-edpm-deployment"
Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.264981 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ddcb794-ab03-4308-a93c-c5929ed96e01" containerName="ssh-known-hosts-edpm-deployment"
Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.265741 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp"
Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.269169 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.269192 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.269465 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.269912 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz"
Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.278167 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp"]
Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.353949 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh6xr\" (UniqueName: \"kubernetes.io/projected/f12ac62a-2011-4e89-a16f-e136959f9d1a-kube-api-access-jh6xr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s6jdp\" (UID: \"f12ac62a-2011-4e89-a16f-e136959f9d1a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp"
Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.354024 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f12ac62a-2011-4e89-a16f-e136959f9d1a-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s6jdp\" (UID: \"f12ac62a-2011-4e89-a16f-e136959f9d1a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp"
Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.354095 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f12ac62a-2011-4e89-a16f-e136959f9d1a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s6jdp\" (UID: \"f12ac62a-2011-4e89-a16f-e136959f9d1a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp"
Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.455614 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f12ac62a-2011-4e89-a16f-e136959f9d1a-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s6jdp\" (UID: \"f12ac62a-2011-4e89-a16f-e136959f9d1a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp"
Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.455773 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f12ac62a-2011-4e89-a16f-e136959f9d1a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s6jdp\" (UID: \"f12ac62a-2011-4e89-a16f-e136959f9d1a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp"
Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.455870 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh6xr\" (UniqueName: \"kubernetes.io/projected/f12ac62a-2011-4e89-a16f-e136959f9d1a-kube-api-access-jh6xr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s6jdp\" (UID: \"f12ac62a-2011-4e89-a16f-e136959f9d1a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp"
Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.461181 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f12ac62a-2011-4e89-a16f-e136959f9d1a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s6jdp\" (UID: \"f12ac62a-2011-4e89-a16f-e136959f9d1a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp"
Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.461962 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f12ac62a-2011-4e89-a16f-e136959f9d1a-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s6jdp\" (UID: \"f12ac62a-2011-4e89-a16f-e136959f9d1a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp"
Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.484795 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh6xr\" (UniqueName: \"kubernetes.io/projected/f12ac62a-2011-4e89-a16f-e136959f9d1a-kube-api-access-jh6xr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s6jdp\" (UID: \"f12ac62a-2011-4e89-a16f-e136959f9d1a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp"
Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.581729 4837 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp"
Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.880284 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp"]
Mar 13 12:18:35 crc kubenswrapper[4837]: I0313 12:18:35.155812 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp" event={"ID":"f12ac62a-2011-4e89-a16f-e136959f9d1a","Type":"ContainerStarted","Data":"2ea8a7545c2f490a975716ffc5914ba1d4d1313b87599c519dda0edd29cdc7cb"}
Mar 13 12:18:36 crc kubenswrapper[4837]: I0313 12:18:36.166001 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp" event={"ID":"f12ac62a-2011-4e89-a16f-e136959f9d1a","Type":"ContainerStarted","Data":"5b0e6698b907c693465a3b16020d70fbda7db26663c7b84f6365068fdb5d08bd"}
Mar 13 12:18:36 crc kubenswrapper[4837]: I0313 12:18:36.189421 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp" podStartSLOduration=1.739640563 podStartE2EDuration="2.189400488s" podCreationTimestamp="2026-03-13 12:18:34 +0000 UTC" firstStartedPulling="2026-03-13 12:18:34.897707132 +0000 UTC m=+1830.535973895" lastFinishedPulling="2026-03-13 12:18:35.347467057 +0000 UTC m=+1830.985733820" observedRunningTime="2026-03-13 12:18:36.182041837 +0000 UTC m=+1831.820308600" watchObservedRunningTime="2026-03-13 12:18:36.189400488 +0000 UTC m=+1831.827667251"
Mar 13 12:18:43 crc kubenswrapper[4837]: I0313 12:18:43.048958 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a"
Mar 13 12:18:43 crc kubenswrapper[4837]: E0313 12:18:43.049850 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8"
Mar 13 12:18:43 crc kubenswrapper[4837]: I0313 12:18:43.239292 4837 generic.go:334] "Generic (PLEG): container finished" podID="f12ac62a-2011-4e89-a16f-e136959f9d1a" containerID="5b0e6698b907c693465a3b16020d70fbda7db26663c7b84f6365068fdb5d08bd" exitCode=0
Mar 13 12:18:43 crc kubenswrapper[4837]: I0313 12:18:43.239337 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp" event={"ID":"f12ac62a-2011-4e89-a16f-e136959f9d1a","Type":"ContainerDied","Data":"5b0e6698b907c693465a3b16020d70fbda7db26663c7b84f6365068fdb5d08bd"}
Mar 13 12:18:44 crc kubenswrapper[4837]: I0313 12:18:44.650747 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp"
Mar 13 12:18:44 crc kubenswrapper[4837]: I0313 12:18:44.846444 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh6xr\" (UniqueName: \"kubernetes.io/projected/f12ac62a-2011-4e89-a16f-e136959f9d1a-kube-api-access-jh6xr\") pod \"f12ac62a-2011-4e89-a16f-e136959f9d1a\" (UID: \"f12ac62a-2011-4e89-a16f-e136959f9d1a\") "
Mar 13 12:18:44 crc kubenswrapper[4837]: I0313 12:18:44.846563 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f12ac62a-2011-4e89-a16f-e136959f9d1a-ssh-key-openstack-edpm-ipam\") pod \"f12ac62a-2011-4e89-a16f-e136959f9d1a\" (UID: \"f12ac62a-2011-4e89-a16f-e136959f9d1a\") "
Mar 13 12:18:44 crc kubenswrapper[4837]: I0313 12:18:44.846669 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f12ac62a-2011-4e89-a16f-e136959f9d1a-inventory\") pod \"f12ac62a-2011-4e89-a16f-e136959f9d1a\" (UID: \"f12ac62a-2011-4e89-a16f-e136959f9d1a\") "
Mar 13 12:18:44 crc kubenswrapper[4837]: I0313 12:18:44.853074 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f12ac62a-2011-4e89-a16f-e136959f9d1a-kube-api-access-jh6xr" (OuterVolumeSpecName: "kube-api-access-jh6xr") pod "f12ac62a-2011-4e89-a16f-e136959f9d1a" (UID: "f12ac62a-2011-4e89-a16f-e136959f9d1a"). InnerVolumeSpecName "kube-api-access-jh6xr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:18:44 crc kubenswrapper[4837]: I0313 12:18:44.877922 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f12ac62a-2011-4e89-a16f-e136959f9d1a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f12ac62a-2011-4e89-a16f-e136959f9d1a" (UID: "f12ac62a-2011-4e89-a16f-e136959f9d1a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:18:44 crc kubenswrapper[4837]: I0313 12:18:44.881917 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f12ac62a-2011-4e89-a16f-e136959f9d1a-inventory" (OuterVolumeSpecName: "inventory") pod "f12ac62a-2011-4e89-a16f-e136959f9d1a" (UID: "f12ac62a-2011-4e89-a16f-e136959f9d1a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:18:44 crc kubenswrapper[4837]: I0313 12:18:44.948862 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh6xr\" (UniqueName: \"kubernetes.io/projected/f12ac62a-2011-4e89-a16f-e136959f9d1a-kube-api-access-jh6xr\") on node \"crc\" DevicePath \"\""
Mar 13 12:18:44 crc kubenswrapper[4837]: I0313 12:18:44.948907 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f12ac62a-2011-4e89-a16f-e136959f9d1a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 13 12:18:44 crc kubenswrapper[4837]: I0313 12:18:44.948921 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f12ac62a-2011-4e89-a16f-e136959f9d1a-inventory\") on node \"crc\" DevicePath \"\""
Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.278269 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp" event={"ID":"f12ac62a-2011-4e89-a16f-e136959f9d1a","Type":"ContainerDied","Data":"2ea8a7545c2f490a975716ffc5914ba1d4d1313b87599c519dda0edd29cdc7cb"}
Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.278314 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ea8a7545c2f490a975716ffc5914ba1d4d1313b87599c519dda0edd29cdc7cb"
Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.278370 4837 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp"
Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.358494 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d"]
Mar 13 12:18:45 crc kubenswrapper[4837]: E0313 12:18:45.359281 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f12ac62a-2011-4e89-a16f-e136959f9d1a" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.359306 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f12ac62a-2011-4e89-a16f-e136959f9d1a" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.359577 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f12ac62a-2011-4e89-a16f-e136959f9d1a" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.360372 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d"
Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.368902 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d"]
Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.376112 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.376153 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.376288 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz"
Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.376301 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.457723 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv2sm\" (UniqueName: \"kubernetes.io/projected/3b96ea7e-2148-4659-9a26-3335c88888c1-kube-api-access-nv2sm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d\" (UID: \"3b96ea7e-2148-4659-9a26-3335c88888c1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d"
Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.457818 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b96ea7e-2148-4659-9a26-3335c88888c1-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d\" (UID: \"3b96ea7e-2148-4659-9a26-3335c88888c1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d"
Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.457908 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b96ea7e-2148-4659-9a26-3335c88888c1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d\" (UID: \"3b96ea7e-2148-4659-9a26-3335c88888c1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d"
Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.559413 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b96ea7e-2148-4659-9a26-3335c88888c1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d\" (UID: \"3b96ea7e-2148-4659-9a26-3335c88888c1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d"
Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.559605 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv2sm\" (UniqueName: \"kubernetes.io/projected/3b96ea7e-2148-4659-9a26-3335c88888c1-kube-api-access-nv2sm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d\" (UID: \"3b96ea7e-2148-4659-9a26-3335c88888c1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d"
Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.559652 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b96ea7e-2148-4659-9a26-3335c88888c1-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d\" (UID: \"3b96ea7e-2148-4659-9a26-3335c88888c1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d"
Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.564274 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b96ea7e-2148-4659-9a26-3335c88888c1-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d\" (UID: \"3b96ea7e-2148-4659-9a26-3335c88888c1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d"
Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.565880 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b96ea7e-2148-4659-9a26-3335c88888c1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d\" (UID: \"3b96ea7e-2148-4659-9a26-3335c88888c1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d"
Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.577688 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv2sm\" (UniqueName: \"kubernetes.io/projected/3b96ea7e-2148-4659-9a26-3335c88888c1-kube-api-access-nv2sm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d\" (UID: \"3b96ea7e-2148-4659-9a26-3335c88888c1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d"
Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.701567 4837 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d"
Mar 13 12:18:46 crc kubenswrapper[4837]: I0313 12:18:46.196279 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d"]
Mar 13 12:18:46 crc kubenswrapper[4837]: I0313 12:18:46.285906 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d" event={"ID":"3b96ea7e-2148-4659-9a26-3335c88888c1","Type":"ContainerStarted","Data":"0bc2be67d1432fb24a9622d1d6fa190c835e77bf54844cd1ef72245a818d45fe"}
Mar 13 12:18:47 crc kubenswrapper[4837]: I0313 12:18:47.298321 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d" event={"ID":"3b96ea7e-2148-4659-9a26-3335c88888c1","Type":"ContainerStarted","Data":"f37ec99eae382be87b274f7fe9869f814c866a0ac73b90f403cedf878941c703"}
Mar 13 12:18:47 crc kubenswrapper[4837]: I0313 12:18:47.322980 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d" podStartSLOduration=1.916345562 podStartE2EDuration="2.322960751s" podCreationTimestamp="2026-03-13 12:18:45 +0000 UTC" firstStartedPulling="2026-03-13 12:18:46.202222008 +0000 UTC m=+1841.840488771" lastFinishedPulling="2026-03-13 12:18:46.608837177 +0000 UTC m=+1842.247103960" observedRunningTime="2026-03-13 12:18:47.318526581 +0000 UTC m=+1842.956793344" watchObservedRunningTime="2026-03-13 12:18:47.322960751 +0000 UTC m=+1842.961227514"
Mar 13 12:18:50 crc kubenswrapper[4837]: I0313 12:18:50.054364 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-xlps2"]
Mar 13 12:18:50 crc kubenswrapper[4837]: I0313 12:18:50.073683 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8mzt4"]
Mar 13 12:18:50 crc kubenswrapper[4837]: I0313 12:18:50.084759 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-xlps2"]
Mar 13 12:18:50 crc kubenswrapper[4837]: I0313 12:18:50.092335 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8mzt4"]
Mar 13 12:18:51 crc kubenswrapper[4837]: I0313 12:18:51.060690 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02b82791-6ef3-4a93-9d5a-84065d62775d" path="/var/lib/kubelet/pods/02b82791-6ef3-4a93-9d5a-84065d62775d/volumes"
Mar 13 12:18:51 crc kubenswrapper[4837]: I0313 12:18:51.061377 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53268342-9adb-48b3-ba5b-52634c2c68fe" path="/var/lib/kubelet/pods/53268342-9adb-48b3-ba5b-52634c2c68fe/volumes"
Mar 13 12:18:55 crc kubenswrapper[4837]: I0313 12:18:55.362755 4837 generic.go:334] "Generic (PLEG): container finished" podID="3b96ea7e-2148-4659-9a26-3335c88888c1" containerID="f37ec99eae382be87b274f7fe9869f814c866a0ac73b90f403cedf878941c703" exitCode=0
Mar 13 12:18:55 crc kubenswrapper[4837]: I0313 12:18:55.362835 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d" event={"ID":"3b96ea7e-2148-4659-9a26-3335c88888c1","Type":"ContainerDied","Data":"f37ec99eae382be87b274f7fe9869f814c866a0ac73b90f403cedf878941c703"}
Mar 13 12:18:56 crc kubenswrapper[4837]: I0313 12:18:56.895043 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d"
Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.049415 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a"
Mar 13 12:18:57 crc kubenswrapper[4837]: E0313 12:18:57.053741 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8"
Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.073502 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv2sm\" (UniqueName: \"kubernetes.io/projected/3b96ea7e-2148-4659-9a26-3335c88888c1-kube-api-access-nv2sm\") pod \"3b96ea7e-2148-4659-9a26-3335c88888c1\" (UID: \"3b96ea7e-2148-4659-9a26-3335c88888c1\") "
Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.073789 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b96ea7e-2148-4659-9a26-3335c88888c1-inventory\") pod \"3b96ea7e-2148-4659-9a26-3335c88888c1\" (UID: \"3b96ea7e-2148-4659-9a26-3335c88888c1\") "
Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.073823 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b96ea7e-2148-4659-9a26-3335c88888c1-ssh-key-openstack-edpm-ipam\") pod \"3b96ea7e-2148-4659-9a26-3335c88888c1\" (UID: \"3b96ea7e-2148-4659-9a26-3335c88888c1\") "
Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.079770 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b96ea7e-2148-4659-9a26-3335c88888c1-kube-api-access-nv2sm" (OuterVolumeSpecName: "kube-api-access-nv2sm") pod "3b96ea7e-2148-4659-9a26-3335c88888c1" (UID: "3b96ea7e-2148-4659-9a26-3335c88888c1"). InnerVolumeSpecName "kube-api-access-nv2sm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.102005 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b96ea7e-2148-4659-9a26-3335c88888c1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3b96ea7e-2148-4659-9a26-3335c88888c1" (UID: "3b96ea7e-2148-4659-9a26-3335c88888c1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.102291 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b96ea7e-2148-4659-9a26-3335c88888c1-inventory" (OuterVolumeSpecName: "inventory") pod "3b96ea7e-2148-4659-9a26-3335c88888c1" (UID: "3b96ea7e-2148-4659-9a26-3335c88888c1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.177384 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b96ea7e-2148-4659-9a26-3335c88888c1-inventory\") on node \"crc\" DevicePath \"\""
Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.177428 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b96ea7e-2148-4659-9a26-3335c88888c1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.177445 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nv2sm\" (UniqueName: \"kubernetes.io/projected/3b96ea7e-2148-4659-9a26-3335c88888c1-kube-api-access-nv2sm\") on node \"crc\" DevicePath \"\""
Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.385913 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d" event={"ID":"3b96ea7e-2148-4659-9a26-3335c88888c1","Type":"ContainerDied","Data":"0bc2be67d1432fb24a9622d1d6fa190c835e77bf54844cd1ef72245a818d45fe"}
Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.385957 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bc2be67d1432fb24a9622d1d6fa190c835e77bf54844cd1ef72245a818d45fe"
Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.385974 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d"
Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.472370 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c"]
Mar 13 12:18:57 crc kubenswrapper[4837]: E0313 12:18:57.473177 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b96ea7e-2148-4659-9a26-3335c88888c1" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.473300 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b96ea7e-2148-4659-9a26-3335c88888c1" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.473581 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b96ea7e-2148-4659-9a26-3335c88888c1" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.474458 4837 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.481614 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.482017 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.482311 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.482274 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.482936 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.483059 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.483176 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.483229 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.487452 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c"] Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.587810 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.588152 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.588207 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.588227 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.588448 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.588548 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.588577 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.588621 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75mvx\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-kube-api-access-75mvx\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.588669 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.588704 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.588822 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.588900 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.588987 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.589014 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691112 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691176 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691238 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75mvx\" (UniqueName: 
\"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-kube-api-access-75mvx\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691264 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691290 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691346 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691384 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-telemetry-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691432 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691457 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691484 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691531 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: 
\"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691591 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691617 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691701 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.696695 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" 
Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.696911 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.698030 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.698443 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.698882 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.699039 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.699105 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.699912 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.700351 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.700621 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.700741 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.702397 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.703965 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.711502 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75mvx\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-kube-api-access-75mvx\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.800491 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:58 crc kubenswrapper[4837]: I0313 12:18:58.331104 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c"] Mar 13 12:18:58 crc kubenswrapper[4837]: I0313 12:18:58.393973 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" event={"ID":"6cc8d0dd-d1e6-4374-bb90-aaefc9197350","Type":"ContainerStarted","Data":"a9a004fb6e650fe374173e7535e9f528dd1cc37af26ae43f015e8366167fa211"} Mar 13 12:18:59 crc kubenswrapper[4837]: I0313 12:18:59.403542 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" event={"ID":"6cc8d0dd-d1e6-4374-bb90-aaefc9197350","Type":"ContainerStarted","Data":"317902596cfacd73a102a6829fe35a61e885c73f5b273e8ac5d10209c855380d"} Mar 13 12:18:59 crc kubenswrapper[4837]: I0313 12:18:59.421112 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" podStartSLOduration=1.994401026 podStartE2EDuration="2.421091047s" podCreationTimestamp="2026-03-13 12:18:57 +0000 UTC" firstStartedPulling="2026-03-13 12:18:58.332667299 +0000 UTC m=+1853.970934052" lastFinishedPulling="2026-03-13 12:18:58.75935731 +0000 UTC m=+1854.397624073" observedRunningTime="2026-03-13 12:18:59.417573486 +0000 UTC m=+1855.055840249" watchObservedRunningTime="2026-03-13 12:18:59.421091047 +0000 UTC m=+1855.059357810" Mar 13 12:19:08 crc kubenswrapper[4837]: I0313 12:19:08.410180 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b85kp"] Mar 13 12:19:08 crc 
kubenswrapper[4837]: I0313 12:19:08.413112 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:08 crc kubenswrapper[4837]: I0313 12:19:08.425049 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b85kp"] Mar 13 12:19:08 crc kubenswrapper[4837]: I0313 12:19:08.601176 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q229j\" (UniqueName: \"kubernetes.io/projected/9900be86-1923-4036-bccc-7e9c0484fb4c-kube-api-access-q229j\") pod \"redhat-operators-b85kp\" (UID: \"9900be86-1923-4036-bccc-7e9c0484fb4c\") " pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:08 crc kubenswrapper[4837]: I0313 12:19:08.601309 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9900be86-1923-4036-bccc-7e9c0484fb4c-utilities\") pod \"redhat-operators-b85kp\" (UID: \"9900be86-1923-4036-bccc-7e9c0484fb4c\") " pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:08 crc kubenswrapper[4837]: I0313 12:19:08.601363 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9900be86-1923-4036-bccc-7e9c0484fb4c-catalog-content\") pod \"redhat-operators-b85kp\" (UID: \"9900be86-1923-4036-bccc-7e9c0484fb4c\") " pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:08 crc kubenswrapper[4837]: I0313 12:19:08.704402 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q229j\" (UniqueName: \"kubernetes.io/projected/9900be86-1923-4036-bccc-7e9c0484fb4c-kube-api-access-q229j\") pod \"redhat-operators-b85kp\" (UID: \"9900be86-1923-4036-bccc-7e9c0484fb4c\") " pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:08 crc 
kubenswrapper[4837]: I0313 12:19:08.704509 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9900be86-1923-4036-bccc-7e9c0484fb4c-utilities\") pod \"redhat-operators-b85kp\" (UID: \"9900be86-1923-4036-bccc-7e9c0484fb4c\") " pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:08 crc kubenswrapper[4837]: I0313 12:19:08.704553 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9900be86-1923-4036-bccc-7e9c0484fb4c-catalog-content\") pod \"redhat-operators-b85kp\" (UID: \"9900be86-1923-4036-bccc-7e9c0484fb4c\") " pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:08 crc kubenswrapper[4837]: I0313 12:19:08.705259 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9900be86-1923-4036-bccc-7e9c0484fb4c-catalog-content\") pod \"redhat-operators-b85kp\" (UID: \"9900be86-1923-4036-bccc-7e9c0484fb4c\") " pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:08 crc kubenswrapper[4837]: I0313 12:19:08.705675 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9900be86-1923-4036-bccc-7e9c0484fb4c-utilities\") pod \"redhat-operators-b85kp\" (UID: \"9900be86-1923-4036-bccc-7e9c0484fb4c\") " pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:08 crc kubenswrapper[4837]: I0313 12:19:08.726422 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q229j\" (UniqueName: \"kubernetes.io/projected/9900be86-1923-4036-bccc-7e9c0484fb4c-kube-api-access-q229j\") pod \"redhat-operators-b85kp\" (UID: \"9900be86-1923-4036-bccc-7e9c0484fb4c\") " pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:08 crc kubenswrapper[4837]: I0313 12:19:08.739707 4837 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.194771 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b85kp"] Mar 13 12:19:09 crc kubenswrapper[4837]: W0313 12:19:09.199706 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9900be86_1923_4036_bccc_7e9c0484fb4c.slice/crio-ca581bf39f1e9f8c7e41b890048bfe29dcb34e40e5c64cbcb1b880ad113b0f6d WatchSource:0}: Error finding container ca581bf39f1e9f8c7e41b890048bfe29dcb34e40e5c64cbcb1b880ad113b0f6d: Status 404 returned error can't find the container with id ca581bf39f1e9f8c7e41b890048bfe29dcb34e40e5c64cbcb1b880ad113b0f6d Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.495565 4837 generic.go:334] "Generic (PLEG): container finished" podID="9900be86-1923-4036-bccc-7e9c0484fb4c" containerID="6fab948827daa6c53b19a10ed677270420a413c7c36a1256f2a3b3246a0b99e4" exitCode=0 Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.495611 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b85kp" event={"ID":"9900be86-1923-4036-bccc-7e9c0484fb4c","Type":"ContainerDied","Data":"6fab948827daa6c53b19a10ed677270420a413c7c36a1256f2a3b3246a0b99e4"} Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.495653 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b85kp" event={"ID":"9900be86-1923-4036-bccc-7e9c0484fb4c","Type":"ContainerStarted","Data":"ca581bf39f1e9f8c7e41b890048bfe29dcb34e40e5c64cbcb1b880ad113b0f6d"} Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.499479 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.813986 4837 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-tsfbn"] Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.816607 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.825961 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tsfbn"] Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.827802 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-utilities\") pod \"certified-operators-tsfbn\" (UID: \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\") " pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.827859 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-catalog-content\") pod \"certified-operators-tsfbn\" (UID: \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\") " pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.827999 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml6fk\" (UniqueName: \"kubernetes.io/projected/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-kube-api-access-ml6fk\") pod \"certified-operators-tsfbn\" (UID: \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\") " pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.930148 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml6fk\" (UniqueName: \"kubernetes.io/projected/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-kube-api-access-ml6fk\") pod \"certified-operators-tsfbn\" (UID: 
\"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\") " pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.930292 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-utilities\") pod \"certified-operators-tsfbn\" (UID: \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\") " pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.930326 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-catalog-content\") pod \"certified-operators-tsfbn\" (UID: \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\") " pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.931069 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-utilities\") pod \"certified-operators-tsfbn\" (UID: \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\") " pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.931139 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-catalog-content\") pod \"certified-operators-tsfbn\" (UID: \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\") " pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.954184 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml6fk\" (UniqueName: \"kubernetes.io/projected/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-kube-api-access-ml6fk\") pod \"certified-operators-tsfbn\" (UID: \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\") " 
pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:10 crc kubenswrapper[4837]: I0313 12:19:10.049129 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:19:10 crc kubenswrapper[4837]: E0313 12:19:10.049422 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:19:10 crc kubenswrapper[4837]: I0313 12:19:10.135581 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:10 crc kubenswrapper[4837]: I0313 12:19:10.516160 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b85kp" event={"ID":"9900be86-1923-4036-bccc-7e9c0484fb4c","Type":"ContainerStarted","Data":"053200fa75a090695d23d906d7072bcc128ba23e140608f0676e9047dd0b4f6e"} Mar 13 12:19:10 crc kubenswrapper[4837]: I0313 12:19:10.717849 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tsfbn"] Mar 13 12:19:11 crc kubenswrapper[4837]: I0313 12:19:11.530206 4837 generic.go:334] "Generic (PLEG): container finished" podID="9900be86-1923-4036-bccc-7e9c0484fb4c" containerID="053200fa75a090695d23d906d7072bcc128ba23e140608f0676e9047dd0b4f6e" exitCode=0 Mar 13 12:19:11 crc kubenswrapper[4837]: I0313 12:19:11.530279 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b85kp" event={"ID":"9900be86-1923-4036-bccc-7e9c0484fb4c","Type":"ContainerDied","Data":"053200fa75a090695d23d906d7072bcc128ba23e140608f0676e9047dd0b4f6e"} Mar 13 
12:19:11 crc kubenswrapper[4837]: I0313 12:19:11.547398 4837 generic.go:334] "Generic (PLEG): container finished" podID="d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" containerID="adad9381bc6a39ddbdad6c4301cbb23bc8c90b91950618f3b2fe7fc956cf30c4" exitCode=0 Mar 13 12:19:11 crc kubenswrapper[4837]: I0313 12:19:11.547440 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsfbn" event={"ID":"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a","Type":"ContainerDied","Data":"adad9381bc6a39ddbdad6c4301cbb23bc8c90b91950618f3b2fe7fc956cf30c4"} Mar 13 12:19:11 crc kubenswrapper[4837]: I0313 12:19:11.547472 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsfbn" event={"ID":"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a","Type":"ContainerStarted","Data":"9cdfda55cf58dcae44b171ff87d0d9876fe414823d1ec7d0b1b7ed1df6f59fe5"} Mar 13 12:19:13 crc kubenswrapper[4837]: I0313 12:19:13.567904 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b85kp" event={"ID":"9900be86-1923-4036-bccc-7e9c0484fb4c","Type":"ContainerStarted","Data":"b0190408b44511066b179adf5b02a13554289d3dacdbf4491cd7c6e138e935cd"} Mar 13 12:19:13 crc kubenswrapper[4837]: I0313 12:19:13.571600 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsfbn" event={"ID":"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a","Type":"ContainerStarted","Data":"50cefa7392ae5d869e28ff046f43607d22feea06cc84d09928cf9eb7cc27bc7c"} Mar 13 12:19:13 crc kubenswrapper[4837]: I0313 12:19:13.593384 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b85kp" podStartSLOduration=2.710451386 podStartE2EDuration="5.593362852s" podCreationTimestamp="2026-03-13 12:19:08 +0000 UTC" firstStartedPulling="2026-03-13 12:19:09.499194528 +0000 UTC m=+1865.137461291" lastFinishedPulling="2026-03-13 12:19:12.382105994 +0000 UTC 
m=+1868.020372757" observedRunningTime="2026-03-13 12:19:13.583958596 +0000 UTC m=+1869.222225369" watchObservedRunningTime="2026-03-13 12:19:13.593362852 +0000 UTC m=+1869.231629615" Mar 13 12:19:14 crc kubenswrapper[4837]: I0313 12:19:14.584472 4837 generic.go:334] "Generic (PLEG): container finished" podID="d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" containerID="50cefa7392ae5d869e28ff046f43607d22feea06cc84d09928cf9eb7cc27bc7c" exitCode=0 Mar 13 12:19:14 crc kubenswrapper[4837]: I0313 12:19:14.584590 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsfbn" event={"ID":"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a","Type":"ContainerDied","Data":"50cefa7392ae5d869e28ff046f43607d22feea06cc84d09928cf9eb7cc27bc7c"} Mar 13 12:19:16 crc kubenswrapper[4837]: I0313 12:19:16.602281 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsfbn" event={"ID":"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a","Type":"ContainerStarted","Data":"bfa2c9a9825342e83fc8ddcd92d9af1ebbe4e83ce8898f8cdf46504fa5d9b0b9"} Mar 13 12:19:16 crc kubenswrapper[4837]: I0313 12:19:16.630474 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tsfbn" podStartSLOduration=3.180937052 podStartE2EDuration="7.630450514s" podCreationTimestamp="2026-03-13 12:19:09 +0000 UTC" firstStartedPulling="2026-03-13 12:19:11.55120443 +0000 UTC m=+1867.189471203" lastFinishedPulling="2026-03-13 12:19:16.000717902 +0000 UTC m=+1871.638984665" observedRunningTime="2026-03-13 12:19:16.62236954 +0000 UTC m=+1872.260636293" watchObservedRunningTime="2026-03-13 12:19:16.630450514 +0000 UTC m=+1872.268717277" Mar 13 12:19:18 crc kubenswrapper[4837]: I0313 12:19:18.740749 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:18 crc kubenswrapper[4837]: I0313 12:19:18.741744 4837 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:19 crc kubenswrapper[4837]: I0313 12:19:19.805216 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b85kp" podUID="9900be86-1923-4036-bccc-7e9c0484fb4c" containerName="registry-server" probeResult="failure" output=< Mar 13 12:19:19 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Mar 13 12:19:19 crc kubenswrapper[4837]: > Mar 13 12:19:20 crc kubenswrapper[4837]: I0313 12:19:20.136757 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:20 crc kubenswrapper[4837]: I0313 12:19:20.136819 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:20 crc kubenswrapper[4837]: I0313 12:19:20.182657 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:25 crc kubenswrapper[4837]: I0313 12:19:25.054091 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:19:25 crc kubenswrapper[4837]: E0313 12:19:25.055035 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:19:28 crc kubenswrapper[4837]: I0313 12:19:28.784625 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:28 crc kubenswrapper[4837]: I0313 
12:19:28.834810 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:29 crc kubenswrapper[4837]: I0313 12:19:29.022963 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b85kp"] Mar 13 12:19:29 crc kubenswrapper[4837]: I0313 12:19:29.763948 4837 scope.go:117] "RemoveContainer" containerID="deea73f54571ed1f4517906256e112c93e642ebacb77d1a62a53b5217eb1d25c" Mar 13 12:19:29 crc kubenswrapper[4837]: I0313 12:19:29.803967 4837 scope.go:117] "RemoveContainer" containerID="5337a2212bdc3b1dbf150fa95afc9aaae420bfce797da10558e36cb08bd46c77" Mar 13 12:19:29 crc kubenswrapper[4837]: I0313 12:19:29.844321 4837 scope.go:117] "RemoveContainer" containerID="5e2fe1dde876f5e43e3e8ce2528c539e8504cc8726824d4a38da88b3f10df140" Mar 13 12:19:30 crc kubenswrapper[4837]: I0313 12:19:30.184734 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:30 crc kubenswrapper[4837]: I0313 12:19:30.725162 4837 generic.go:334] "Generic (PLEG): container finished" podID="6cc8d0dd-d1e6-4374-bb90-aaefc9197350" containerID="317902596cfacd73a102a6829fe35a61e885c73f5b273e8ac5d10209c855380d" exitCode=0 Mar 13 12:19:30 crc kubenswrapper[4837]: I0313 12:19:30.725355 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b85kp" podUID="9900be86-1923-4036-bccc-7e9c0484fb4c" containerName="registry-server" containerID="cri-o://b0190408b44511066b179adf5b02a13554289d3dacdbf4491cd7c6e138e935cd" gracePeriod=2 Mar 13 12:19:30 crc kubenswrapper[4837]: I0313 12:19:30.725596 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" 
event={"ID":"6cc8d0dd-d1e6-4374-bb90-aaefc9197350","Type":"ContainerDied","Data":"317902596cfacd73a102a6829fe35a61e885c73f5b273e8ac5d10209c855380d"} Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.427406 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tsfbn"] Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.427971 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tsfbn" podUID="d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" containerName="registry-server" containerID="cri-o://bfa2c9a9825342e83fc8ddcd92d9af1ebbe4e83ce8898f8cdf46504fa5d9b0b9" gracePeriod=2 Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.698747 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.735078 4837 generic.go:334] "Generic (PLEG): container finished" podID="9900be86-1923-4036-bccc-7e9c0484fb4c" containerID="b0190408b44511066b179adf5b02a13554289d3dacdbf4491cd7c6e138e935cd" exitCode=0 Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.735142 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b85kp" event={"ID":"9900be86-1923-4036-bccc-7e9c0484fb4c","Type":"ContainerDied","Data":"b0190408b44511066b179adf5b02a13554289d3dacdbf4491cd7c6e138e935cd"} Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.735169 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b85kp" event={"ID":"9900be86-1923-4036-bccc-7e9c0484fb4c","Type":"ContainerDied","Data":"ca581bf39f1e9f8c7e41b890048bfe29dcb34e40e5c64cbcb1b880ad113b0f6d"} Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.735185 4837 scope.go:117] "RemoveContainer" containerID="b0190408b44511066b179adf5b02a13554289d3dacdbf4491cd7c6e138e935cd" Mar 13 12:19:31 crc 
kubenswrapper[4837]: I0313 12:19:31.735201 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.737466 4837 generic.go:334] "Generic (PLEG): container finished" podID="d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" containerID="bfa2c9a9825342e83fc8ddcd92d9af1ebbe4e83ce8898f8cdf46504fa5d9b0b9" exitCode=0 Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.737619 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsfbn" event={"ID":"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a","Type":"ContainerDied","Data":"bfa2c9a9825342e83fc8ddcd92d9af1ebbe4e83ce8898f8cdf46504fa5d9b0b9"} Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.758087 4837 scope.go:117] "RemoveContainer" containerID="053200fa75a090695d23d906d7072bcc128ba23e140608f0676e9047dd0b4f6e" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.817193 4837 scope.go:117] "RemoveContainer" containerID="6fab948827daa6c53b19a10ed677270420a413c7c36a1256f2a3b3246a0b99e4" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.840331 4837 scope.go:117] "RemoveContainer" containerID="b0190408b44511066b179adf5b02a13554289d3dacdbf4491cd7c6e138e935cd" Mar 13 12:19:31 crc kubenswrapper[4837]: E0313 12:19:31.842156 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0190408b44511066b179adf5b02a13554289d3dacdbf4491cd7c6e138e935cd\": container with ID starting with b0190408b44511066b179adf5b02a13554289d3dacdbf4491cd7c6e138e935cd not found: ID does not exist" containerID="b0190408b44511066b179adf5b02a13554289d3dacdbf4491cd7c6e138e935cd" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.842210 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0190408b44511066b179adf5b02a13554289d3dacdbf4491cd7c6e138e935cd"} err="failed 
to get container status \"b0190408b44511066b179adf5b02a13554289d3dacdbf4491cd7c6e138e935cd\": rpc error: code = NotFound desc = could not find container \"b0190408b44511066b179adf5b02a13554289d3dacdbf4491cd7c6e138e935cd\": container with ID starting with b0190408b44511066b179adf5b02a13554289d3dacdbf4491cd7c6e138e935cd not found: ID does not exist" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.842239 4837 scope.go:117] "RemoveContainer" containerID="053200fa75a090695d23d906d7072bcc128ba23e140608f0676e9047dd0b4f6e" Mar 13 12:19:31 crc kubenswrapper[4837]: E0313 12:19:31.842478 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"053200fa75a090695d23d906d7072bcc128ba23e140608f0676e9047dd0b4f6e\": container with ID starting with 053200fa75a090695d23d906d7072bcc128ba23e140608f0676e9047dd0b4f6e not found: ID does not exist" containerID="053200fa75a090695d23d906d7072bcc128ba23e140608f0676e9047dd0b4f6e" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.842510 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"053200fa75a090695d23d906d7072bcc128ba23e140608f0676e9047dd0b4f6e"} err="failed to get container status \"053200fa75a090695d23d906d7072bcc128ba23e140608f0676e9047dd0b4f6e\": rpc error: code = NotFound desc = could not find container \"053200fa75a090695d23d906d7072bcc128ba23e140608f0676e9047dd0b4f6e\": container with ID starting with 053200fa75a090695d23d906d7072bcc128ba23e140608f0676e9047dd0b4f6e not found: ID does not exist" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.842527 4837 scope.go:117] "RemoveContainer" containerID="6fab948827daa6c53b19a10ed677270420a413c7c36a1256f2a3b3246a0b99e4" Mar 13 12:19:31 crc kubenswrapper[4837]: E0313 12:19:31.842981 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6fab948827daa6c53b19a10ed677270420a413c7c36a1256f2a3b3246a0b99e4\": container with ID starting with 6fab948827daa6c53b19a10ed677270420a413c7c36a1256f2a3b3246a0b99e4 not found: ID does not exist" containerID="6fab948827daa6c53b19a10ed677270420a413c7c36a1256f2a3b3246a0b99e4" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.843004 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fab948827daa6c53b19a10ed677270420a413c7c36a1256f2a3b3246a0b99e4"} err="failed to get container status \"6fab948827daa6c53b19a10ed677270420a413c7c36a1256f2a3b3246a0b99e4\": rpc error: code = NotFound desc = could not find container \"6fab948827daa6c53b19a10ed677270420a413c7c36a1256f2a3b3246a0b99e4\": container with ID starting with 6fab948827daa6c53b19a10ed677270420a413c7c36a1256f2a3b3246a0b99e4 not found: ID does not exist" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.859827 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q229j\" (UniqueName: \"kubernetes.io/projected/9900be86-1923-4036-bccc-7e9c0484fb4c-kube-api-access-q229j\") pod \"9900be86-1923-4036-bccc-7e9c0484fb4c\" (UID: \"9900be86-1923-4036-bccc-7e9c0484fb4c\") " Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.859963 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9900be86-1923-4036-bccc-7e9c0484fb4c-catalog-content\") pod \"9900be86-1923-4036-bccc-7e9c0484fb4c\" (UID: \"9900be86-1923-4036-bccc-7e9c0484fb4c\") " Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.860023 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9900be86-1923-4036-bccc-7e9c0484fb4c-utilities\") pod \"9900be86-1923-4036-bccc-7e9c0484fb4c\" (UID: \"9900be86-1923-4036-bccc-7e9c0484fb4c\") " Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.860687 
4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9900be86-1923-4036-bccc-7e9c0484fb4c-utilities" (OuterVolumeSpecName: "utilities") pod "9900be86-1923-4036-bccc-7e9c0484fb4c" (UID: "9900be86-1923-4036-bccc-7e9c0484fb4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.866602 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9900be86-1923-4036-bccc-7e9c0484fb4c-kube-api-access-q229j" (OuterVolumeSpecName: "kube-api-access-q229j") pod "9900be86-1923-4036-bccc-7e9c0484fb4c" (UID: "9900be86-1923-4036-bccc-7e9c0484fb4c"). InnerVolumeSpecName "kube-api-access-q229j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.931870 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.963492 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q229j\" (UniqueName: \"kubernetes.io/projected/9900be86-1923-4036-bccc-7e9c0484fb4c-kube-api-access-q229j\") on node \"crc\" DevicePath \"\"" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.963541 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9900be86-1923-4036-bccc-7e9c0484fb4c-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.005948 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9900be86-1923-4036-bccc-7e9c0484fb4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9900be86-1923-4036-bccc-7e9c0484fb4c" (UID: "9900be86-1923-4036-bccc-7e9c0484fb4c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.065158 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml6fk\" (UniqueName: \"kubernetes.io/projected/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-kube-api-access-ml6fk\") pod \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\" (UID: \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\") " Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.065274 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-utilities\") pod \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\" (UID: \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\") " Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.065428 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-catalog-content\") pod \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\" (UID: \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\") " Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.066000 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9900be86-1923-4036-bccc-7e9c0484fb4c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.066402 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-utilities" (OuterVolumeSpecName: "utilities") pod "d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" (UID: "d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.069263 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-kube-api-access-ml6fk" (OuterVolumeSpecName: "kube-api-access-ml6fk") pod "d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" (UID: "d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a"). InnerVolumeSpecName "kube-api-access-ml6fk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.128907 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" (UID: "d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.141237 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.167826 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b85kp"] Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.168120 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.168176 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.168194 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ml6fk\" (UniqueName: \"kubernetes.io/projected/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-kube-api-access-ml6fk\") on node \"crc\" DevicePath \"\"" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.184785 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b85kp"] Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.269378 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-libvirt-combined-ca-bundle\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.269420 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-nova-combined-ca-bundle\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " Mar 13 
12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.269452 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-ovn-default-certs-0\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") "
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.270299 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") "
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.270344 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-neutron-metadata-combined-ca-bundle\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") "
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.270381 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75mvx\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-kube-api-access-75mvx\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") "
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.270416 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-ovn-combined-ca-bundle\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") "
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.270688 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-bootstrap-combined-ca-bundle\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") "
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.271028 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") "
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.271112 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-inventory\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") "
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.271572 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-repo-setup-combined-ca-bundle\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") "
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.271667 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-telemetry-combined-ca-bundle\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") "
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.271755 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") "
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.273428 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-ssh-key-openstack-edpm-ipam\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") "
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.275294 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.275911 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.276173 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.278966 4837 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.279000 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.279029 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.279823 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.281903 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.292536 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-kube-api-access-75mvx" (OuterVolumeSpecName: "kube-api-access-75mvx") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "kube-api-access-75mvx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.297314 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.297300 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.297964 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.298307 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.298492 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.298692 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.374151 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.374752 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-inventory" (OuterVolumeSpecName: "inventory") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.380534 4837 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.380578 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.380593 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-inventory\") on node \"crc\" DevicePath \"\""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.380602 4837 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.380613 4837 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.380623 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.380631 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.380680 4837 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.380688 4837 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.380698 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75mvx\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-kube-api-access-75mvx\") on node \"crc\" DevicePath \"\""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.380708 4837 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.748400 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" event={"ID":"6cc8d0dd-d1e6-4374-bb90-aaefc9197350","Type":"ContainerDied","Data":"a9a004fb6e650fe374173e7535e9f528dd1cc37af26ae43f015e8366167fa211"}
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.748453 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9a004fb6e650fe374173e7535e9f528dd1cc37af26ae43f015e8366167fa211"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.748543 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.753170 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsfbn" event={"ID":"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a","Type":"ContainerDied","Data":"9cdfda55cf58dcae44b171ff87d0d9876fe414823d1ec7d0b1b7ed1df6f59fe5"}
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.753236 4837 scope.go:117] "RemoveContainer" containerID="bfa2c9a9825342e83fc8ddcd92d9af1ebbe4e83ce8898f8cdf46504fa5d9b0b9"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.753234 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tsfbn"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.785543 4837 scope.go:117] "RemoveContainer" containerID="50cefa7392ae5d869e28ff046f43607d22feea06cc84d09928cf9eb7cc27bc7c"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.805713 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tsfbn"]
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.807865 4837 scope.go:117] "RemoveContainer" containerID="adad9381bc6a39ddbdad6c4301cbb23bc8c90b91950618f3b2fe7fc956cf30c4"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.814846 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tsfbn"]
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.888351 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp"]
Mar 13 12:19:32 crc kubenswrapper[4837]: E0313 12:19:32.888770 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cc8d0dd-d1e6-4374-bb90-aaefc9197350" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.888796 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cc8d0dd-d1e6-4374-bb90-aaefc9197350" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Mar 13 12:19:32 crc kubenswrapper[4837]: E0313 12:19:32.888824 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" containerName="extract-content"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.888833 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" containerName="extract-content"
Mar 13 12:19:32 crc kubenswrapper[4837]: E0313 12:19:32.888855 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" containerName="extract-utilities"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.888863 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" containerName="extract-utilities"
Mar 13 12:19:32 crc kubenswrapper[4837]: E0313 12:19:32.888880 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9900be86-1923-4036-bccc-7e9c0484fb4c" containerName="extract-content"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.888887 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9900be86-1923-4036-bccc-7e9c0484fb4c" containerName="extract-content"
Mar 13 12:19:32 crc kubenswrapper[4837]: E0313 12:19:32.888896 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" containerName="registry-server"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.888905 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" containerName="registry-server"
Mar 13 12:19:32 crc kubenswrapper[4837]: E0313 12:19:32.888922 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9900be86-1923-4036-bccc-7e9c0484fb4c" containerName="registry-server"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.888931 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9900be86-1923-4036-bccc-7e9c0484fb4c" containerName="registry-server"
Mar 13 12:19:32 crc kubenswrapper[4837]: E0313 12:19:32.888942 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9900be86-1923-4036-bccc-7e9c0484fb4c" containerName="extract-utilities"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.888950 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9900be86-1923-4036-bccc-7e9c0484fb4c" containerName="extract-utilities"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.889196 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" containerName="registry-server"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.889221 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="9900be86-1923-4036-bccc-7e9c0484fb4c" containerName="registry-server"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.889238 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cc8d0dd-d1e6-4374-bb90-aaefc9197350" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.889990 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.892514 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.892920 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.893099 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.900530 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.900679 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.901343 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp"]
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.991863 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.992096 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.992170 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.992261 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v59h5\" (UniqueName: \"kubernetes.io/projected/092bd277-504a-450d-aca1-d8ecc18f0c9f-kube-api-access-v59h5\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp"
Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.992366 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/092bd277-504a-450d-aca1-d8ecc18f0c9f-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp"
Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.059768 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9900be86-1923-4036-bccc-7e9c0484fb4c" path="/var/lib/kubelet/pods/9900be86-1923-4036-bccc-7e9c0484fb4c/volumes"
Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.060842 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" path="/var/lib/kubelet/pods/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a/volumes"
Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.094440 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp"
Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.094557 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp"
Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.094590 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp"
Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.094617 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v59h5\" (UniqueName: \"kubernetes.io/projected/092bd277-504a-450d-aca1-d8ecc18f0c9f-kube-api-access-v59h5\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp"
Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.094666 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/092bd277-504a-450d-aca1-d8ecc18f0c9f-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp"
Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.095884 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/092bd277-504a-450d-aca1-d8ecc18f0c9f-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp"
Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.100753 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp"
Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.100939 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp"
Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.102371 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp"
Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.116302 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v59h5\" (UniqueName: \"kubernetes.io/projected/092bd277-504a-450d-aca1-d8ecc18f0c9f-kube-api-access-v59h5\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp"
Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.230143 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp"
Mar 13 12:19:33 crc kubenswrapper[4837]: W0313 12:19:33.729287 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod092bd277_504a_450d_aca1_d8ecc18f0c9f.slice/crio-a99bf809e709f4093c6a3a9be8928febf788cb56e51d961891f138284d2fd35e WatchSource:0}: Error finding container a99bf809e709f4093c6a3a9be8928febf788cb56e51d961891f138284d2fd35e: Status 404 returned error can't find the container with id a99bf809e709f4093c6a3a9be8928febf788cb56e51d961891f138284d2fd35e
Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.731846 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp"]
Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.765101 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" event={"ID":"092bd277-504a-450d-aca1-d8ecc18f0c9f","Type":"ContainerStarted","Data":"a99bf809e709f4093c6a3a9be8928febf788cb56e51d961891f138284d2fd35e"}
Mar 13 12:19:34 crc kubenswrapper[4837]: I0313 12:19:34.042754 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-mzmd5"]
Mar 13 12:19:34 crc kubenswrapper[4837]: I0313 12:19:34.060664 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-mzmd5"]
Mar 13 12:19:34 crc kubenswrapper[4837]: I0313 12:19:34.775843 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" event={"ID":"092bd277-504a-450d-aca1-d8ecc18f0c9f","Type":"ContainerStarted","Data":"77078cd73552b4fd4a97cf95b6976032937dfb766ff067aae032358f923a91d8"}
Mar 13 12:19:34 crc kubenswrapper[4837]: I0313 12:19:34.798536 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" podStartSLOduration=2.383676351 podStartE2EDuration="2.798515959s" podCreationTimestamp="2026-03-13 12:19:32 +0000 UTC" firstStartedPulling="2026-03-13 12:19:33.731763962 +0000 UTC m=+1889.370030725" lastFinishedPulling="2026-03-13 12:19:34.14660357 +0000 UTC m=+1889.784870333" observedRunningTime="2026-03-13 12:19:34.792391947 +0000 UTC m=+1890.430658730" watchObservedRunningTime="2026-03-13 12:19:34.798515959 +0000 UTC m=+1890.436782722"
Mar 13 12:19:35 crc kubenswrapper[4837]: I0313 12:19:35.060886 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0f45aae-caa3-4c50-9059-be42d328cba1" path="/var/lib/kubelet/pods/f0f45aae-caa3-4c50-9059-be42d328cba1/volumes"
Mar 13 12:19:39 crc kubenswrapper[4837]: I0313 12:19:39.048178 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a"
Mar 13 12:19:39 crc kubenswrapper[4837]: I0313 12:19:39.819750 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"95ed8f8c7021ad56734ed8e8626e89cd8f2efcdf1bf9a33ce258f19439eeb037"}
Mar 13 12:20:00 crc kubenswrapper[4837]: I0313 12:20:00.140814 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556740-snmw2"]
Mar 13 12:20:00 crc kubenswrapper[4837]: I0313 12:20:00.142758 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556740-snmw2"
Mar 13 12:20:00 crc kubenswrapper[4837]: I0313 12:20:00.144904 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 12:20:00 crc kubenswrapper[4837]: I0313 12:20:00.145030 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 12:20:00 crc kubenswrapper[4837]: I0313 12:20:00.146747 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj"
Mar 13 12:20:00 crc kubenswrapper[4837]: I0313 12:20:00.156222 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556740-snmw2"]
Mar 13 12:20:00 crc kubenswrapper[4837]: I0313 12:20:00.309577 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j68n2\" (UniqueName: \"kubernetes.io/projected/e01710d7-a463-41fe-9d86-2410a8ccd8e8-kube-api-access-j68n2\") pod \"auto-csr-approver-29556740-snmw2\" (UID: \"e01710d7-a463-41fe-9d86-2410a8ccd8e8\") " pod="openshift-infra/auto-csr-approver-29556740-snmw2"
Mar 13 12:20:00 crc kubenswrapper[4837]: I0313 12:20:00.412053 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j68n2\" (UniqueName: \"kubernetes.io/projected/e01710d7-a463-41fe-9d86-2410a8ccd8e8-kube-api-access-j68n2\") pod \"auto-csr-approver-29556740-snmw2\" (UID: \"e01710d7-a463-41fe-9d86-2410a8ccd8e8\") " pod="openshift-infra/auto-csr-approver-29556740-snmw2"
Mar 13 12:20:00 crc kubenswrapper[4837]: I0313 12:20:00.434549 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j68n2\" (UniqueName: \"kubernetes.io/projected/e01710d7-a463-41fe-9d86-2410a8ccd8e8-kube-api-access-j68n2\") pod \"auto-csr-approver-29556740-snmw2\" (UID: \"e01710d7-a463-41fe-9d86-2410a8ccd8e8\") " pod="openshift-infra/auto-csr-approver-29556740-snmw2"
Mar 13 12:20:00 crc kubenswrapper[4837]: I0313 12:20:00.481964 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556740-snmw2"
Mar 13 12:20:00 crc kubenswrapper[4837]: I0313 12:20:00.923504 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556740-snmw2"]
Mar 13 12:20:01 crc kubenswrapper[4837]: I0313 12:20:01.002987 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556740-snmw2" event={"ID":"e01710d7-a463-41fe-9d86-2410a8ccd8e8","Type":"ContainerStarted","Data":"2872cecee5e2227b12f3a548445c20274e30fe510d83ffd4afcd24e93795e826"}
Mar 13 12:20:03 crc kubenswrapper[4837]: I0313 12:20:03.025356 4837 generic.go:334] "Generic (PLEG): container finished" podID="e01710d7-a463-41fe-9d86-2410a8ccd8e8" containerID="5c53da3a56d1c8f877bdab4d65362dc1a8c31f8cd4991718456d0c1946898d66" exitCode=0
Mar 13 12:20:03 crc kubenswrapper[4837]: I0313 12:20:03.025496 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556740-snmw2" event={"ID":"e01710d7-a463-41fe-9d86-2410a8ccd8e8","Type":"ContainerDied","Data":"5c53da3a56d1c8f877bdab4d65362dc1a8c31f8cd4991718456d0c1946898d66"}
Mar 13 12:20:04 crc kubenswrapper[4837]: I0313 12:20:04.354534 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556740-snmw2"
Mar 13 12:20:04 crc kubenswrapper[4837]: I0313 12:20:04.489075 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j68n2\" (UniqueName: \"kubernetes.io/projected/e01710d7-a463-41fe-9d86-2410a8ccd8e8-kube-api-access-j68n2\") pod \"e01710d7-a463-41fe-9d86-2410a8ccd8e8\" (UID: \"e01710d7-a463-41fe-9d86-2410a8ccd8e8\") "
Mar 13 12:20:04 crc kubenswrapper[4837]: I0313 12:20:04.494152 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e01710d7-a463-41fe-9d86-2410a8ccd8e8-kube-api-access-j68n2" (OuterVolumeSpecName: "kube-api-access-j68n2") pod "e01710d7-a463-41fe-9d86-2410a8ccd8e8" (UID: "e01710d7-a463-41fe-9d86-2410a8ccd8e8"). InnerVolumeSpecName "kube-api-access-j68n2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:20:04 crc kubenswrapper[4837]: I0313 12:20:04.590979 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j68n2\" (UniqueName: \"kubernetes.io/projected/e01710d7-a463-41fe-9d86-2410a8ccd8e8-kube-api-access-j68n2\") on node \"crc\" DevicePath \"\""
Mar 13 12:20:05 crc kubenswrapper[4837]: I0313 12:20:05.047978 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556740-snmw2"
Mar 13 12:20:05 crc kubenswrapper[4837]: I0313 12:20:05.065198 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556740-snmw2" event={"ID":"e01710d7-a463-41fe-9d86-2410a8ccd8e8","Type":"ContainerDied","Data":"2872cecee5e2227b12f3a548445c20274e30fe510d83ffd4afcd24e93795e826"}
Mar 13 12:20:05 crc kubenswrapper[4837]: I0313 12:20:05.065257 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2872cecee5e2227b12f3a548445c20274e30fe510d83ffd4afcd24e93795e826"
Mar 13 12:20:05 crc kubenswrapper[4837]: I0313 12:20:05.416455 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556734-g7zt7"]
Mar 13 12:20:05 crc kubenswrapper[4837]: I0313 12:20:05.424864 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556734-g7zt7"]
Mar 13 12:20:07 crc kubenswrapper[4837]: I0313 12:20:07.059487 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b41b916d-46ab-43e8-b624-bb1fb6aaf2f8" path="/var/lib/kubelet/pods/b41b916d-46ab-43e8-b624-bb1fb6aaf2f8/volumes"
Mar 13 12:20:12 crc kubenswrapper[4837]: E0313 12:20:12.185795 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode01710d7_a463_41fe_9d86_2410a8ccd8e8.slice\": RecentStats: unable to find data in memory cache]"
Mar 13 12:20:22 crc kubenswrapper[4837]: E0313 12:20:22.431918 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode01710d7_a463_41fe_9d86_2410a8ccd8e8.slice\": RecentStats: unable to find data in memory cache]"
Mar 13 12:20:28 crc kubenswrapper[4837]: I0313 12:20:28.249398 4837 generic.go:334] "Generic (PLEG):
container finished" podID="092bd277-504a-450d-aca1-d8ecc18f0c9f" containerID="77078cd73552b4fd4a97cf95b6976032937dfb766ff067aae032358f923a91d8" exitCode=0 Mar 13 12:20:28 crc kubenswrapper[4837]: I0313 12:20:28.249470 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" event={"ID":"092bd277-504a-450d-aca1-d8ecc18f0c9f","Type":"ContainerDied","Data":"77078cd73552b4fd4a97cf95b6976032937dfb766ff067aae032358f923a91d8"} Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.656169 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.678202 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/092bd277-504a-450d-aca1-d8ecc18f0c9f-ovncontroller-config-0\") pod \"092bd277-504a-450d-aca1-d8ecc18f0c9f\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.678269 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-ovn-combined-ca-bundle\") pod \"092bd277-504a-450d-aca1-d8ecc18f0c9f\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.678318 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-inventory\") pod \"092bd277-504a-450d-aca1-d8ecc18f0c9f\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.678355 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-ssh-key-openstack-edpm-ipam\") pod \"092bd277-504a-450d-aca1-d8ecc18f0c9f\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.678395 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v59h5\" (UniqueName: \"kubernetes.io/projected/092bd277-504a-450d-aca1-d8ecc18f0c9f-kube-api-access-v59h5\") pod \"092bd277-504a-450d-aca1-d8ecc18f0c9f\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.684812 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/092bd277-504a-450d-aca1-d8ecc18f0c9f-kube-api-access-v59h5" (OuterVolumeSpecName: "kube-api-access-v59h5") pod "092bd277-504a-450d-aca1-d8ecc18f0c9f" (UID: "092bd277-504a-450d-aca1-d8ecc18f0c9f"). InnerVolumeSpecName "kube-api-access-v59h5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.684828 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "092bd277-504a-450d-aca1-d8ecc18f0c9f" (UID: "092bd277-504a-450d-aca1-d8ecc18f0c9f"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.710090 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-inventory" (OuterVolumeSpecName: "inventory") pod "092bd277-504a-450d-aca1-d8ecc18f0c9f" (UID: "092bd277-504a-450d-aca1-d8ecc18f0c9f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.713828 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "092bd277-504a-450d-aca1-d8ecc18f0c9f" (UID: "092bd277-504a-450d-aca1-d8ecc18f0c9f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.715221 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/092bd277-504a-450d-aca1-d8ecc18f0c9f-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "092bd277-504a-450d-aca1-d8ecc18f0c9f" (UID: "092bd277-504a-450d-aca1-d8ecc18f0c9f"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.780061 4837 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/092bd277-504a-450d-aca1-d8ecc18f0c9f-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.780105 4837 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.780118 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.780129 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.780142 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v59h5\" (UniqueName: \"kubernetes.io/projected/092bd277-504a-450d-aca1-d8ecc18f0c9f-kube-api-access-v59h5\") on node \"crc\" DevicePath \"\"" Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.948611 4837 scope.go:117] "RemoveContainer" containerID="e540ca1787fcba1ed1f9804f4336a11c9388c115ed0bc76404d559071e68ab56" Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.997853 4837 scope.go:117] "RemoveContainer" containerID="618f29cef46a018933eff3564372eb6b93270ae38a4b8bb52de53e9e241ebfba" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.269969 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" event={"ID":"092bd277-504a-450d-aca1-d8ecc18f0c9f","Type":"ContainerDied","Data":"a99bf809e709f4093c6a3a9be8928febf788cb56e51d961891f138284d2fd35e"} Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.270250 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a99bf809e709f4093c6a3a9be8928febf788cb56e51d961891f138284d2fd35e" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.270037 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.352831 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4"] Mar 13 12:20:30 crc kubenswrapper[4837]: E0313 12:20:30.353289 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="092bd277-504a-450d-aca1-d8ecc18f0c9f" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.353309 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="092bd277-504a-450d-aca1-d8ecc18f0c9f" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 13 12:20:30 crc kubenswrapper[4837]: E0313 12:20:30.353327 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e01710d7-a463-41fe-9d86-2410a8ccd8e8" containerName="oc" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.353336 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e01710d7-a463-41fe-9d86-2410a8ccd8e8" containerName="oc" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.353582 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e01710d7-a463-41fe-9d86-2410a8ccd8e8" containerName="oc" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.353600 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="092bd277-504a-450d-aca1-d8ecc18f0c9f" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.354536 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.360906 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.360944 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.360959 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.361108 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.361850 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.362430 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.369547 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4"] Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.391279 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59kht\" (UniqueName: \"kubernetes.io/projected/20f35066-9c10-4433-a655-f5cef18d4deb-kube-api-access-59kht\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.391373 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.391452 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.391692 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.391854 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.391974 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.494051 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59kht\" (UniqueName: \"kubernetes.io/projected/20f35066-9c10-4433-a655-f5cef18d4deb-kube-api-access-59kht\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.494146 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.494202 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.494316 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.494408 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.494484 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.498333 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.498551 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.498901 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.499676 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.500760 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.512957 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59kht\" (UniqueName: \"kubernetes.io/projected/20f35066-9c10-4433-a655-f5cef18d4deb-kube-api-access-59kht\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: 
\"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.671543 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:31 crc kubenswrapper[4837]: I0313 12:20:31.173870 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4"] Mar 13 12:20:31 crc kubenswrapper[4837]: I0313 12:20:31.281166 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" event={"ID":"20f35066-9c10-4433-a655-f5cef18d4deb","Type":"ContainerStarted","Data":"904b0b4d824437fa7194e901c77da3b777325b0d631b66698c9a20c59c99d938"} Mar 13 12:20:32 crc kubenswrapper[4837]: I0313 12:20:32.290041 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" event={"ID":"20f35066-9c10-4433-a655-f5cef18d4deb","Type":"ContainerStarted","Data":"ff8212009c342279b5e1961bf82567e4bb8b1fc5a57b88231787fdcfc37b919c"} Mar 13 12:20:32 crc kubenswrapper[4837]: I0313 12:20:32.313842 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" podStartSLOduration=1.804728444 podStartE2EDuration="2.313825724s" podCreationTimestamp="2026-03-13 12:20:30 +0000 UTC" firstStartedPulling="2026-03-13 12:20:31.182202328 +0000 UTC m=+1946.820469091" lastFinishedPulling="2026-03-13 12:20:31.691299608 +0000 UTC m=+1947.329566371" observedRunningTime="2026-03-13 12:20:32.312933745 +0000 UTC m=+1947.951200508" watchObservedRunningTime="2026-03-13 12:20:32.313825724 +0000 UTC m=+1947.952092487" Mar 13 12:20:32 crc kubenswrapper[4837]: E0313 12:20:32.675921 4837 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode01710d7_a463_41fe_9d86_2410a8ccd8e8.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:20:42 crc kubenswrapper[4837]: E0313 12:20:42.914286 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode01710d7_a463_41fe_9d86_2410a8ccd8e8.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:20:53 crc kubenswrapper[4837]: E0313 12:20:53.149899 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode01710d7_a463_41fe_9d86_2410a8ccd8e8.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:21:03 crc kubenswrapper[4837]: E0313 12:21:03.378474 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode01710d7_a463_41fe_9d86_2410a8ccd8e8.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:21:13 crc kubenswrapper[4837]: I0313 12:21:13.625022 4837 generic.go:334] "Generic (PLEG): container finished" podID="20f35066-9c10-4433-a655-f5cef18d4deb" containerID="ff8212009c342279b5e1961bf82567e4bb8b1fc5a57b88231787fdcfc37b919c" exitCode=0 Mar 13 12:21:13 crc kubenswrapper[4837]: I0313 12:21:13.625102 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" event={"ID":"20f35066-9c10-4433-a655-f5cef18d4deb","Type":"ContainerDied","Data":"ff8212009c342279b5e1961bf82567e4bb8b1fc5a57b88231787fdcfc37b919c"} Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.059942 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.164958 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-nova-metadata-neutron-config-0\") pod \"20f35066-9c10-4433-a655-f5cef18d4deb\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.165037 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-neutron-metadata-combined-ca-bundle\") pod \"20f35066-9c10-4433-a655-f5cef18d4deb\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.165082 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59kht\" (UniqueName: \"kubernetes.io/projected/20f35066-9c10-4433-a655-f5cef18d4deb-kube-api-access-59kht\") pod \"20f35066-9c10-4433-a655-f5cef18d4deb\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.165113 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-ssh-key-openstack-edpm-ipam\") pod \"20f35066-9c10-4433-a655-f5cef18d4deb\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.165864 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"20f35066-9c10-4433-a655-f5cef18d4deb\" (UID: 
\"20f35066-9c10-4433-a655-f5cef18d4deb\") " Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.166036 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-inventory\") pod \"20f35066-9c10-4433-a655-f5cef18d4deb\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.171992 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20f35066-9c10-4433-a655-f5cef18d4deb-kube-api-access-59kht" (OuterVolumeSpecName: "kube-api-access-59kht") pod "20f35066-9c10-4433-a655-f5cef18d4deb" (UID: "20f35066-9c10-4433-a655-f5cef18d4deb"). InnerVolumeSpecName "kube-api-access-59kht". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.183119 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "20f35066-9c10-4433-a655-f5cef18d4deb" (UID: "20f35066-9c10-4433-a655-f5cef18d4deb"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.195389 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "20f35066-9c10-4433-a655-f5cef18d4deb" (UID: "20f35066-9c10-4433-a655-f5cef18d4deb"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.197462 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "20f35066-9c10-4433-a655-f5cef18d4deb" (UID: "20f35066-9c10-4433-a655-f5cef18d4deb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.202621 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "20f35066-9c10-4433-a655-f5cef18d4deb" (UID: "20f35066-9c10-4433-a655-f5cef18d4deb"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.207326 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-inventory" (OuterVolumeSpecName: "inventory") pod "20f35066-9c10-4433-a655-f5cef18d4deb" (UID: "20f35066-9c10-4433-a655-f5cef18d4deb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.269002 4837 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.269031 4837 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.269068 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59kht\" (UniqueName: \"kubernetes.io/projected/20f35066-9c10-4433-a655-f5cef18d4deb-kube-api-access-59kht\") on node \"crc\" DevicePath \"\"" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.269078 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.269088 4837 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.269100 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.640682 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" event={"ID":"20f35066-9c10-4433-a655-f5cef18d4deb","Type":"ContainerDied","Data":"904b0b4d824437fa7194e901c77da3b777325b0d631b66698c9a20c59c99d938"} Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.640725 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="904b0b4d824437fa7194e901c77da3b777325b0d631b66698c9a20c59c99d938" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.640734 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.766165 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5"] Mar 13 12:21:15 crc kubenswrapper[4837]: E0313 12:21:15.766744 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20f35066-9c10-4433-a655-f5cef18d4deb" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.766847 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f35066-9c10-4433-a655-f5cef18d4deb" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.767075 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="20f35066-9c10-4433-a655-f5cef18d4deb" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.770391 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.777736 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.778082 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.778526 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.781510 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.782719 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.788396 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5"] Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.895871 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.895926 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.895974 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.896308 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.896389 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwdjz\" (UniqueName: \"kubernetes.io/projected/394104d4-0291-4071-a7da-d7b71e0f4083-kube-api-access-zwdjz\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.998748 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.998808 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zwdjz\" (UniqueName: \"kubernetes.io/projected/394104d4-0291-4071-a7da-d7b71e0f4083-kube-api-access-zwdjz\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.998930 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.998959 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.998998 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:16 crc kubenswrapper[4837]: I0313 12:21:16.002995 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: 
\"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:16 crc kubenswrapper[4837]: I0313 12:21:16.004072 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:16 crc kubenswrapper[4837]: I0313 12:21:16.005037 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:16 crc kubenswrapper[4837]: I0313 12:21:16.007165 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:16 crc kubenswrapper[4837]: I0313 12:21:16.019444 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwdjz\" (UniqueName: \"kubernetes.io/projected/394104d4-0291-4071-a7da-d7b71e0f4083-kube-api-access-zwdjz\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:16 crc kubenswrapper[4837]: I0313 12:21:16.096999 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:16 crc kubenswrapper[4837]: I0313 12:21:16.638566 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5"] Mar 13 12:21:17 crc kubenswrapper[4837]: I0313 12:21:17.659399 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" event={"ID":"394104d4-0291-4071-a7da-d7b71e0f4083","Type":"ContainerStarted","Data":"f30e37b9f7f0384121aa71f44589ddb9d3068a703ce922ccc922c61ff88b1f38"} Mar 13 12:21:17 crc kubenswrapper[4837]: I0313 12:21:17.660756 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" event={"ID":"394104d4-0291-4071-a7da-d7b71e0f4083","Type":"ContainerStarted","Data":"635868923cf7f5008b52abe367a7a6d82aa47f6efef93a5cafc25c193c32e1e5"} Mar 13 12:21:17 crc kubenswrapper[4837]: I0313 12:21:17.677997 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" podStartSLOduration=1.9677041119999998 podStartE2EDuration="2.677975126s" podCreationTimestamp="2026-03-13 12:21:15 +0000 UTC" firstStartedPulling="2026-03-13 12:21:16.64434545 +0000 UTC m=+1992.282612213" lastFinishedPulling="2026-03-13 12:21:17.354616464 +0000 UTC m=+1992.992883227" observedRunningTime="2026-03-13 12:21:17.672218575 +0000 UTC m=+1993.310485338" watchObservedRunningTime="2026-03-13 12:21:17.677975126 +0000 UTC m=+1993.316241889" Mar 13 12:22:00 crc kubenswrapper[4837]: I0313 12:22:00.140313 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556742-5ggnq"] Mar 13 12:22:00 crc kubenswrapper[4837]: I0313 12:22:00.142416 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556742-5ggnq" Mar 13 12:22:00 crc kubenswrapper[4837]: I0313 12:22:00.144789 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:22:00 crc kubenswrapper[4837]: I0313 12:22:00.145011 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:22:00 crc kubenswrapper[4837]: I0313 12:22:00.145672 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:22:00 crc kubenswrapper[4837]: I0313 12:22:00.149303 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556742-5ggnq"] Mar 13 12:22:00 crc kubenswrapper[4837]: I0313 12:22:00.282232 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf6v4\" (UniqueName: \"kubernetes.io/projected/aed6dbbf-3a09-4b60-9757-7c74a07f9c63-kube-api-access-qf6v4\") pod \"auto-csr-approver-29556742-5ggnq\" (UID: \"aed6dbbf-3a09-4b60-9757-7c74a07f9c63\") " pod="openshift-infra/auto-csr-approver-29556742-5ggnq" Mar 13 12:22:00 crc kubenswrapper[4837]: I0313 12:22:00.384094 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf6v4\" (UniqueName: \"kubernetes.io/projected/aed6dbbf-3a09-4b60-9757-7c74a07f9c63-kube-api-access-qf6v4\") pod \"auto-csr-approver-29556742-5ggnq\" (UID: \"aed6dbbf-3a09-4b60-9757-7c74a07f9c63\") " pod="openshift-infra/auto-csr-approver-29556742-5ggnq" Mar 13 12:22:00 crc kubenswrapper[4837]: I0313 12:22:00.412533 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf6v4\" (UniqueName: \"kubernetes.io/projected/aed6dbbf-3a09-4b60-9757-7c74a07f9c63-kube-api-access-qf6v4\") pod \"auto-csr-approver-29556742-5ggnq\" (UID: \"aed6dbbf-3a09-4b60-9757-7c74a07f9c63\") " 
pod="openshift-infra/auto-csr-approver-29556742-5ggnq" Mar 13 12:22:00 crc kubenswrapper[4837]: I0313 12:22:00.500410 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556742-5ggnq" Mar 13 12:22:00 crc kubenswrapper[4837]: I0313 12:22:00.923990 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556742-5ggnq"] Mar 13 12:22:01 crc kubenswrapper[4837]: I0313 12:22:01.038326 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556742-5ggnq" event={"ID":"aed6dbbf-3a09-4b60-9757-7c74a07f9c63","Type":"ContainerStarted","Data":"58ec7ab55c0c919cae79f2a8321f6600f4d542955baab0750f8a171f55c53c13"} Mar 13 12:22:03 crc kubenswrapper[4837]: I0313 12:22:03.065380 4837 generic.go:334] "Generic (PLEG): container finished" podID="aed6dbbf-3a09-4b60-9757-7c74a07f9c63" containerID="852beb2b4218c3ee146b9596afb327ce3ec642be20ae0116d12166c03475804d" exitCode=0 Mar 13 12:22:03 crc kubenswrapper[4837]: I0313 12:22:03.066410 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556742-5ggnq" event={"ID":"aed6dbbf-3a09-4b60-9757-7c74a07f9c63","Type":"ContainerDied","Data":"852beb2b4218c3ee146b9596afb327ce3ec642be20ae0116d12166c03475804d"} Mar 13 12:22:04 crc kubenswrapper[4837]: I0313 12:22:04.442848 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556742-5ggnq" Mar 13 12:22:04 crc kubenswrapper[4837]: I0313 12:22:04.487742 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qf6v4\" (UniqueName: \"kubernetes.io/projected/aed6dbbf-3a09-4b60-9757-7c74a07f9c63-kube-api-access-qf6v4\") pod \"aed6dbbf-3a09-4b60-9757-7c74a07f9c63\" (UID: \"aed6dbbf-3a09-4b60-9757-7c74a07f9c63\") " Mar 13 12:22:04 crc kubenswrapper[4837]: I0313 12:22:04.495872 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aed6dbbf-3a09-4b60-9757-7c74a07f9c63-kube-api-access-qf6v4" (OuterVolumeSpecName: "kube-api-access-qf6v4") pod "aed6dbbf-3a09-4b60-9757-7c74a07f9c63" (UID: "aed6dbbf-3a09-4b60-9757-7c74a07f9c63"). InnerVolumeSpecName "kube-api-access-qf6v4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:22:04 crc kubenswrapper[4837]: I0313 12:22:04.589814 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qf6v4\" (UniqueName: \"kubernetes.io/projected/aed6dbbf-3a09-4b60-9757-7c74a07f9c63-kube-api-access-qf6v4\") on node \"crc\" DevicePath \"\"" Mar 13 12:22:05 crc kubenswrapper[4837]: I0313 12:22:05.084540 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556742-5ggnq" event={"ID":"aed6dbbf-3a09-4b60-9757-7c74a07f9c63","Type":"ContainerDied","Data":"58ec7ab55c0c919cae79f2a8321f6600f4d542955baab0750f8a171f55c53c13"} Mar 13 12:22:05 crc kubenswrapper[4837]: I0313 12:22:05.084583 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58ec7ab55c0c919cae79f2a8321f6600f4d542955baab0750f8a171f55c53c13" Mar 13 12:22:05 crc kubenswrapper[4837]: I0313 12:22:05.084649 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556742-5ggnq" Mar 13 12:22:05 crc kubenswrapper[4837]: I0313 12:22:05.483474 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:22:05 crc kubenswrapper[4837]: I0313 12:22:05.483866 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:22:05 crc kubenswrapper[4837]: I0313 12:22:05.506150 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556736-26kwx"] Mar 13 12:22:05 crc kubenswrapper[4837]: I0313 12:22:05.514140 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556736-26kwx"] Mar 13 12:22:07 crc kubenswrapper[4837]: I0313 12:22:07.061517 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2c1518c-d031-4597-ab77-8626e068bcda" path="/var/lib/kubelet/pods/a2c1518c-d031-4597-ab77-8626e068bcda/volumes" Mar 13 12:22:14 crc kubenswrapper[4837]: E0313 12:22:14.984337 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaed6dbbf_3a09_4b60_9757_7c74a07f9c63.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:22:25 crc kubenswrapper[4837]: E0313 12:22:25.225122 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaed6dbbf_3a09_4b60_9757_7c74a07f9c63.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:22:30 crc kubenswrapper[4837]: I0313 12:22:30.140106 4837 scope.go:117] "RemoveContainer" containerID="eab36df7c6a9acf9dc7560368f9674c4b5510068e382ff493b327a540b10eb38" Mar 13 12:22:35 crc kubenswrapper[4837]: E0313 12:22:35.453842 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaed6dbbf_3a09_4b60_9757_7c74a07f9c63.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:22:35 crc kubenswrapper[4837]: I0313 12:22:35.483527 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:22:35 crc kubenswrapper[4837]: I0313 12:22:35.483602 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:22:45 crc kubenswrapper[4837]: E0313 12:22:45.680125 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaed6dbbf_3a09_4b60_9757_7c74a07f9c63.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:22:55 crc kubenswrapper[4837]: E0313 12:22:55.911702 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaed6dbbf_3a09_4b60_9757_7c74a07f9c63.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:23:05 crc kubenswrapper[4837]: I0313 12:23:05.484091 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:23:05 crc kubenswrapper[4837]: I0313 12:23:05.484559 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:23:05 crc kubenswrapper[4837]: I0313 12:23:05.484609 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 12:23:05 crc kubenswrapper[4837]: I0313 12:23:05.485329 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"95ed8f8c7021ad56734ed8e8626e89cd8f2efcdf1bf9a33ce258f19439eeb037"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:23:05 crc kubenswrapper[4837]: I0313 12:23:05.485381 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://95ed8f8c7021ad56734ed8e8626e89cd8f2efcdf1bf9a33ce258f19439eeb037" gracePeriod=600 Mar 13 12:23:05 crc kubenswrapper[4837]: I0313 
12:23:05.614861 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="95ed8f8c7021ad56734ed8e8626e89cd8f2efcdf1bf9a33ce258f19439eeb037" exitCode=0 Mar 13 12:23:05 crc kubenswrapper[4837]: I0313 12:23:05.614914 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"95ed8f8c7021ad56734ed8e8626e89cd8f2efcdf1bf9a33ce258f19439eeb037"} Mar 13 12:23:05 crc kubenswrapper[4837]: I0313 12:23:05.614954 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:23:06 crc kubenswrapper[4837]: I0313 12:23:06.626410 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504"} Mar 13 12:24:00 crc kubenswrapper[4837]: I0313 12:24:00.149911 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556744-m59fh"] Mar 13 12:24:00 crc kubenswrapper[4837]: E0313 12:24:00.151085 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aed6dbbf-3a09-4b60-9757-7c74a07f9c63" containerName="oc" Mar 13 12:24:00 crc kubenswrapper[4837]: I0313 12:24:00.151108 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="aed6dbbf-3a09-4b60-9757-7c74a07f9c63" containerName="oc" Mar 13 12:24:00 crc kubenswrapper[4837]: I0313 12:24:00.151484 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="aed6dbbf-3a09-4b60-9757-7c74a07f9c63" containerName="oc" Mar 13 12:24:00 crc kubenswrapper[4837]: I0313 12:24:00.152379 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556744-m59fh" Mar 13 12:24:00 crc kubenswrapper[4837]: I0313 12:24:00.154653 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:24:00 crc kubenswrapper[4837]: I0313 12:24:00.158335 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:24:00 crc kubenswrapper[4837]: I0313 12:24:00.158669 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:24:00 crc kubenswrapper[4837]: I0313 12:24:00.162214 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556744-m59fh"] Mar 13 12:24:00 crc kubenswrapper[4837]: I0313 12:24:00.302407 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjls7\" (UniqueName: \"kubernetes.io/projected/1e934250-1bb4-41fe-b36e-2acf48194bcf-kube-api-access-fjls7\") pod \"auto-csr-approver-29556744-m59fh\" (UID: \"1e934250-1bb4-41fe-b36e-2acf48194bcf\") " pod="openshift-infra/auto-csr-approver-29556744-m59fh" Mar 13 12:24:00 crc kubenswrapper[4837]: I0313 12:24:00.405352 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjls7\" (UniqueName: \"kubernetes.io/projected/1e934250-1bb4-41fe-b36e-2acf48194bcf-kube-api-access-fjls7\") pod \"auto-csr-approver-29556744-m59fh\" (UID: \"1e934250-1bb4-41fe-b36e-2acf48194bcf\") " pod="openshift-infra/auto-csr-approver-29556744-m59fh" Mar 13 12:24:00 crc kubenswrapper[4837]: I0313 12:24:00.423875 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjls7\" (UniqueName: \"kubernetes.io/projected/1e934250-1bb4-41fe-b36e-2acf48194bcf-kube-api-access-fjls7\") pod \"auto-csr-approver-29556744-m59fh\" (UID: \"1e934250-1bb4-41fe-b36e-2acf48194bcf\") " 
pod="openshift-infra/auto-csr-approver-29556744-m59fh" Mar 13 12:24:00 crc kubenswrapper[4837]: I0313 12:24:00.475131 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556744-m59fh" Mar 13 12:24:00 crc kubenswrapper[4837]: I0313 12:24:00.912124 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556744-m59fh"] Mar 13 12:24:01 crc kubenswrapper[4837]: I0313 12:24:01.156983 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556744-m59fh" event={"ID":"1e934250-1bb4-41fe-b36e-2acf48194bcf","Type":"ContainerStarted","Data":"344af983d8397c15688256e604b383f3dd8ba0e599135f66bdd75e3c171eca4b"} Mar 13 12:24:03 crc kubenswrapper[4837]: I0313 12:24:03.176859 4837 generic.go:334] "Generic (PLEG): container finished" podID="1e934250-1bb4-41fe-b36e-2acf48194bcf" containerID="5d6f6eb18121de7ac4f3538b881026fd87404ecae22fe7e8d631b874d26990e4" exitCode=0 Mar 13 12:24:03 crc kubenswrapper[4837]: I0313 12:24:03.176941 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556744-m59fh" event={"ID":"1e934250-1bb4-41fe-b36e-2acf48194bcf","Type":"ContainerDied","Data":"5d6f6eb18121de7ac4f3538b881026fd87404ecae22fe7e8d631b874d26990e4"} Mar 13 12:24:04 crc kubenswrapper[4837]: I0313 12:24:04.504918 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556744-m59fh" Mar 13 12:24:04 crc kubenswrapper[4837]: I0313 12:24:04.694482 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjls7\" (UniqueName: \"kubernetes.io/projected/1e934250-1bb4-41fe-b36e-2acf48194bcf-kube-api-access-fjls7\") pod \"1e934250-1bb4-41fe-b36e-2acf48194bcf\" (UID: \"1e934250-1bb4-41fe-b36e-2acf48194bcf\") " Mar 13 12:24:04 crc kubenswrapper[4837]: I0313 12:24:04.702180 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e934250-1bb4-41fe-b36e-2acf48194bcf-kube-api-access-fjls7" (OuterVolumeSpecName: "kube-api-access-fjls7") pod "1e934250-1bb4-41fe-b36e-2acf48194bcf" (UID: "1e934250-1bb4-41fe-b36e-2acf48194bcf"). InnerVolumeSpecName "kube-api-access-fjls7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:24:04 crc kubenswrapper[4837]: I0313 12:24:04.797150 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjls7\" (UniqueName: \"kubernetes.io/projected/1e934250-1bb4-41fe-b36e-2acf48194bcf-kube-api-access-fjls7\") on node \"crc\" DevicePath \"\"" Mar 13 12:24:05 crc kubenswrapper[4837]: I0313 12:24:05.197944 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556744-m59fh" event={"ID":"1e934250-1bb4-41fe-b36e-2acf48194bcf","Type":"ContainerDied","Data":"344af983d8397c15688256e604b383f3dd8ba0e599135f66bdd75e3c171eca4b"} Mar 13 12:24:05 crc kubenswrapper[4837]: I0313 12:24:05.198297 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="344af983d8397c15688256e604b383f3dd8ba0e599135f66bdd75e3c171eca4b" Mar 13 12:24:05 crc kubenswrapper[4837]: I0313 12:24:05.197988 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556744-m59fh"
Mar 13 12:24:05 crc kubenswrapper[4837]: I0313 12:24:05.572133 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556738-trdfn"]
Mar 13 12:24:05 crc kubenswrapper[4837]: I0313 12:24:05.581831 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556738-trdfn"]
Mar 13 12:24:07 crc kubenswrapper[4837]: I0313 12:24:07.058889 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abf39778-b981-4807-916d-f62ff0a03ac9" path="/var/lib/kubelet/pods/abf39778-b981-4807-916d-f62ff0a03ac9/volumes"
Mar 13 12:24:30 crc kubenswrapper[4837]: I0313 12:24:30.265161 4837 scope.go:117] "RemoveContainer" containerID="7e866ef5a9a2608fd8aa30e6d573f07172996e7b068a978cf3d3449b179bd748"
Mar 13 12:24:42 crc kubenswrapper[4837]: I0313 12:24:42.512850 4837 generic.go:334] "Generic (PLEG): container finished" podID="394104d4-0291-4071-a7da-d7b71e0f4083" containerID="f30e37b9f7f0384121aa71f44589ddb9d3068a703ce922ccc922c61ff88b1f38" exitCode=0
Mar 13 12:24:42 crc kubenswrapper[4837]: I0313 12:24:42.512956 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" event={"ID":"394104d4-0291-4071-a7da-d7b71e0f4083","Type":"ContainerDied","Data":"f30e37b9f7f0384121aa71f44589ddb9d3068a703ce922ccc922c61ff88b1f38"}
Mar 13 12:24:43 crc kubenswrapper[4837]: I0313 12:24:43.903946 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.073091 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-libvirt-combined-ca-bundle\") pod \"394104d4-0291-4071-a7da-d7b71e0f4083\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") "
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.073208 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwdjz\" (UniqueName: \"kubernetes.io/projected/394104d4-0291-4071-a7da-d7b71e0f4083-kube-api-access-zwdjz\") pod \"394104d4-0291-4071-a7da-d7b71e0f4083\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") "
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.073231 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-libvirt-secret-0\") pod \"394104d4-0291-4071-a7da-d7b71e0f4083\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") "
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.073335 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-inventory\") pod \"394104d4-0291-4071-a7da-d7b71e0f4083\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") "
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.073369 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-ssh-key-openstack-edpm-ipam\") pod \"394104d4-0291-4071-a7da-d7b71e0f4083\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") "
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.079502 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/394104d4-0291-4071-a7da-d7b71e0f4083-kube-api-access-zwdjz" (OuterVolumeSpecName: "kube-api-access-zwdjz") pod "394104d4-0291-4071-a7da-d7b71e0f4083" (UID: "394104d4-0291-4071-a7da-d7b71e0f4083"). InnerVolumeSpecName "kube-api-access-zwdjz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.081364 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "394104d4-0291-4071-a7da-d7b71e0f4083" (UID: "394104d4-0291-4071-a7da-d7b71e0f4083"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.106290 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "394104d4-0291-4071-a7da-d7b71e0f4083" (UID: "394104d4-0291-4071-a7da-d7b71e0f4083"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.109495 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-inventory" (OuterVolumeSpecName: "inventory") pod "394104d4-0291-4071-a7da-d7b71e0f4083" (UID: "394104d4-0291-4071-a7da-d7b71e0f4083"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.110540 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "394104d4-0291-4071-a7da-d7b71e0f4083" (UID: "394104d4-0291-4071-a7da-d7b71e0f4083"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.175631 4837 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.175687 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwdjz\" (UniqueName: \"kubernetes.io/projected/394104d4-0291-4071-a7da-d7b71e0f4083-kube-api-access-zwdjz\") on node \"crc\" DevicePath \"\""
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.175714 4837 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.175736 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-inventory\") on node \"crc\" DevicePath \"\""
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.175750 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.541779 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" event={"ID":"394104d4-0291-4071-a7da-d7b71e0f4083","Type":"ContainerDied","Data":"635868923cf7f5008b52abe367a7a6d82aa47f6efef93a5cafc25c193c32e1e5"}
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.541873 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="635868923cf7f5008b52abe367a7a6d82aa47f6efef93a5cafc25c193c32e1e5"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.542155 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.632110 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"]
Mar 13 12:24:44 crc kubenswrapper[4837]: E0313 12:24:44.632769 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="394104d4-0291-4071-a7da-d7b71e0f4083" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.632843 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="394104d4-0291-4071-a7da-d7b71e0f4083" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 13 12:24:44 crc kubenswrapper[4837]: E0313 12:24:44.632903 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e934250-1bb4-41fe-b36e-2acf48194bcf" containerName="oc"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.632980 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e934250-1bb4-41fe-b36e-2acf48194bcf" containerName="oc"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.633204 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e934250-1bb4-41fe-b36e-2acf48194bcf" containerName="oc"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.633271 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="394104d4-0291-4071-a7da-d7b71e0f4083" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.634014 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.636096 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.636416 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.636674 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.638126 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.638281 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.638449 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.638583 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.660165 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"]
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.787086 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.787144 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.787163 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.787252 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.787273 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e6986f16-e143-49f4-81e5-58abba717876-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.787292 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.787322 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.787371 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzqgf\" (UniqueName: \"kubernetes.io/projected/e6986f16-e143-49f4-81e5-58abba717876-kube-api-access-fzqgf\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.787435 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.787451 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.787468 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.888938 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.888986 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e6986f16-e143-49f4-81e5-58abba717876-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.889007 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.889040 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.889088 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzqgf\" (UniqueName: \"kubernetes.io/projected/e6986f16-e143-49f4-81e5-58abba717876-kube-api-access-fzqgf\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.889126 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.889142 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.889157 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.889195 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.889234 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.889250 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.894149 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.894549 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e6986f16-e143-49f4-81e5-58abba717876-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.895425 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.897293 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.897626 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.897911 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.898131 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.902356 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.902990 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.914193 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.919255 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzqgf\" (UniqueName: \"kubernetes.io/projected/e6986f16-e143-49f4-81e5-58abba717876-kube-api-access-fzqgf\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.961872 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"
Mar 13 12:24:45 crc kubenswrapper[4837]: I0313 12:24:45.517263 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"]
Mar 13 12:24:45 crc kubenswrapper[4837]: W0313 12:24:45.524523 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6986f16_e143_49f4_81e5_58abba717876.slice/crio-7cc82d3fd71f5170b65dc07b84d25573c7febed498176a66fed2dfd3b4619643 WatchSource:0}: Error finding container 7cc82d3fd71f5170b65dc07b84d25573c7febed498176a66fed2dfd3b4619643: Status 404 returned error can't find the container with id 7cc82d3fd71f5170b65dc07b84d25573c7febed498176a66fed2dfd3b4619643
Mar 13 12:24:45 crc kubenswrapper[4837]: I0313 12:24:45.528090 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 12:24:45 crc kubenswrapper[4837]: I0313 12:24:45.554939 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" event={"ID":"e6986f16-e143-49f4-81e5-58abba717876","Type":"ContainerStarted","Data":"7cc82d3fd71f5170b65dc07b84d25573c7febed498176a66fed2dfd3b4619643"}
Mar 13 12:24:46 crc kubenswrapper[4837]: I0313 12:24:46.564001 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" event={"ID":"e6986f16-e143-49f4-81e5-58abba717876","Type":"ContainerStarted","Data":"18a83cd1cba4b0ec8cbb0763088a8fc20438f178de5bb307e3e42d268b1d9ec5"}
Mar 13 12:24:46 crc kubenswrapper[4837]: I0313 12:24:46.586855 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" podStartSLOduration=2.113208818 podStartE2EDuration="2.586839393s" podCreationTimestamp="2026-03-13 12:24:44 +0000 UTC" firstStartedPulling="2026-03-13 12:24:45.52783777 +0000 UTC m=+2201.166104533" lastFinishedPulling="2026-03-13 12:24:46.001468335 +0000 UTC m=+2201.639735108" observedRunningTime="2026-03-13 12:24:46.580987969 +0000 UTC m=+2202.219254752" watchObservedRunningTime="2026-03-13 12:24:46.586839393 +0000 UTC m=+2202.225106156"
Mar 13 12:25:05 crc kubenswrapper[4837]: I0313 12:25:05.484372 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 12:25:05 crc kubenswrapper[4837]: I0313 12:25:05.485111 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 12:25:35 crc kubenswrapper[4837]: I0313 12:25:35.483950 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 12:25:35 crc kubenswrapper[4837]: I0313 12:25:35.484978 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 12:25:54 crc kubenswrapper[4837]: I0313 12:25:54.978198 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j6t59"]
Mar 13 12:25:54 crc kubenswrapper[4837]: I0313 12:25:54.981197 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6t59"
Mar 13 12:25:54 crc kubenswrapper[4837]: I0313 12:25:54.993727 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j6t59"]
Mar 13 12:25:55 crc kubenswrapper[4837]: I0313 12:25:55.068487 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69jzh\" (UniqueName: \"kubernetes.io/projected/5739768e-3825-4869-9a20-d65269d6ff6e-kube-api-access-69jzh\") pod \"community-operators-j6t59\" (UID: \"5739768e-3825-4869-9a20-d65269d6ff6e\") " pod="openshift-marketplace/community-operators-j6t59"
Mar 13 12:25:55 crc kubenswrapper[4837]: I0313 12:25:55.068698 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5739768e-3825-4869-9a20-d65269d6ff6e-catalog-content\") pod \"community-operators-j6t59\" (UID: \"5739768e-3825-4869-9a20-d65269d6ff6e\") " pod="openshift-marketplace/community-operators-j6t59"
Mar 13 12:25:55 crc kubenswrapper[4837]: I0313 12:25:55.068729 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5739768e-3825-4869-9a20-d65269d6ff6e-utilities\") pod \"community-operators-j6t59\" (UID: \"5739768e-3825-4869-9a20-d65269d6ff6e\") " pod="openshift-marketplace/community-operators-j6t59"
Mar 13 12:25:55 crc kubenswrapper[4837]: I0313 12:25:55.170750 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69jzh\" (UniqueName: \"kubernetes.io/projected/5739768e-3825-4869-9a20-d65269d6ff6e-kube-api-access-69jzh\") pod \"community-operators-j6t59\" (UID: \"5739768e-3825-4869-9a20-d65269d6ff6e\") " pod="openshift-marketplace/community-operators-j6t59"
Mar 13 12:25:55 crc kubenswrapper[4837]: I0313 12:25:55.170924 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5739768e-3825-4869-9a20-d65269d6ff6e-catalog-content\") pod \"community-operators-j6t59\" (UID: \"5739768e-3825-4869-9a20-d65269d6ff6e\") " pod="openshift-marketplace/community-operators-j6t59"
Mar 13 12:25:55 crc kubenswrapper[4837]: I0313 12:25:55.170946 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5739768e-3825-4869-9a20-d65269d6ff6e-utilities\") pod \"community-operators-j6t59\" (UID: \"5739768e-3825-4869-9a20-d65269d6ff6e\") " pod="openshift-marketplace/community-operators-j6t59"
Mar 13 12:25:55 crc kubenswrapper[4837]: I0313 12:25:55.171436 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5739768e-3825-4869-9a20-d65269d6ff6e-utilities\") pod \"community-operators-j6t59\" (UID: \"5739768e-3825-4869-9a20-d65269d6ff6e\") " pod="openshift-marketplace/community-operators-j6t59"
Mar 13 12:25:55 crc kubenswrapper[4837]: I0313 12:25:55.171611 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5739768e-3825-4869-9a20-d65269d6ff6e-catalog-content\") pod \"community-operators-j6t59\" (UID: \"5739768e-3825-4869-9a20-d65269d6ff6e\") " pod="openshift-marketplace/community-operators-j6t59"
Mar 13 12:25:55 crc kubenswrapper[4837]: I0313 12:25:55.195002 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69jzh\" (UniqueName: \"kubernetes.io/projected/5739768e-3825-4869-9a20-d65269d6ff6e-kube-api-access-69jzh\") pod \"community-operators-j6t59\" (UID: \"5739768e-3825-4869-9a20-d65269d6ff6e\") " pod="openshift-marketplace/community-operators-j6t59"
Mar 13 12:25:55 crc kubenswrapper[4837]: I0313 12:25:55.313989 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6t59"
Mar 13 12:25:55 crc kubenswrapper[4837]: I0313 12:25:55.806572 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j6t59"]
Mar 13 12:25:56 crc kubenswrapper[4837]: I0313 12:25:56.397713 4837 generic.go:334] "Generic (PLEG): container finished" podID="5739768e-3825-4869-9a20-d65269d6ff6e" containerID="8ec46fdf64b8b41f5ed86e596fecb59aef4dd2e3445048553d57b57f44cdf29d" exitCode=0
Mar 13 12:25:56 crc kubenswrapper[4837]: I0313 12:25:56.398153 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6t59" event={"ID":"5739768e-3825-4869-9a20-d65269d6ff6e","Type":"ContainerDied","Data":"8ec46fdf64b8b41f5ed86e596fecb59aef4dd2e3445048553d57b57f44cdf29d"}
Mar 13 12:25:56 crc kubenswrapper[4837]: I0313 12:25:56.398182 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6t59" event={"ID":"5739768e-3825-4869-9a20-d65269d6ff6e","Type":"ContainerStarted","Data":"6e5ab350326a007d05239aba067c1ae7270bb4feadf1120d3dff1a07c76500a1"}
Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.367207 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n7dpp"]
Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.369819 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n7dpp"
Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.382487 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7dpp"]
Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.415593 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6t59" event={"ID":"5739768e-3825-4869-9a20-d65269d6ff6e","Type":"ContainerStarted","Data":"18665a547215957278c7198a584f67fb5944ff6d98d64e5f47da861c793724c2"}
Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.526930 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdac88ff-0567-4477-b88a-a90c2bc99da8-catalog-content\") pod \"redhat-marketplace-n7dpp\" (UID: \"fdac88ff-0567-4477-b88a-a90c2bc99da8\") " pod="openshift-marketplace/redhat-marketplace-n7dpp"
Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.526992 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c9hp\" (UniqueName: \"kubernetes.io/projected/fdac88ff-0567-4477-b88a-a90c2bc99da8-kube-api-access-7c9hp\") pod \"redhat-marketplace-n7dpp\" (UID: \"fdac88ff-0567-4477-b88a-a90c2bc99da8\") " pod="openshift-marketplace/redhat-marketplace-n7dpp"
Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.527029 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdac88ff-0567-4477-b88a-a90c2bc99da8-utilities\") pod \"redhat-marketplace-n7dpp\" (UID: \"fdac88ff-0567-4477-b88a-a90c2bc99da8\") " pod="openshift-marketplace/redhat-marketplace-n7dpp"
Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.629292 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdac88ff-0567-4477-b88a-a90c2bc99da8-utilities\") pod \"redhat-marketplace-n7dpp\" (UID: \"fdac88ff-0567-4477-b88a-a90c2bc99da8\") " pod="openshift-marketplace/redhat-marketplace-n7dpp"
Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.629559 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdac88ff-0567-4477-b88a-a90c2bc99da8-utilities\") pod \"redhat-marketplace-n7dpp\" (UID: \"fdac88ff-0567-4477-b88a-a90c2bc99da8\") " pod="openshift-marketplace/redhat-marketplace-n7dpp"
Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.629811 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdac88ff-0567-4477-b88a-a90c2bc99da8-catalog-content\") pod \"redhat-marketplace-n7dpp\" (UID: \"fdac88ff-0567-4477-b88a-a90c2bc99da8\") " pod="openshift-marketplace/redhat-marketplace-n7dpp"
Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.629863 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c9hp\" (UniqueName: \"kubernetes.io/projected/fdac88ff-0567-4477-b88a-a90c2bc99da8-kube-api-access-7c9hp\") pod \"redhat-marketplace-n7dpp\" (UID: \"fdac88ff-0567-4477-b88a-a90c2bc99da8\") " pod="openshift-marketplace/redhat-marketplace-n7dpp"
Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.630552 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdac88ff-0567-4477-b88a-a90c2bc99da8-catalog-content\") pod \"redhat-marketplace-n7dpp\" (UID: \"fdac88ff-0567-4477-b88a-a90c2bc99da8\") " pod="openshift-marketplace/redhat-marketplace-n7dpp"
Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.660295 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c9hp\" (UniqueName: \"kubernetes.io/projected/fdac88ff-0567-4477-b88a-a90c2bc99da8-kube-api-access-7c9hp\") pod \"redhat-marketplace-n7dpp\" (UID: \"fdac88ff-0567-4477-b88a-a90c2bc99da8\") " pod="openshift-marketplace/redhat-marketplace-n7dpp"
Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.691157 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n7dpp"
Mar 13 12:25:58 crc kubenswrapper[4837]: I0313 12:25:58.164746 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7dpp"]
Mar 13 12:25:58 crc kubenswrapper[4837]: I0313 12:25:58.425746 4837 generic.go:334] "Generic (PLEG): container finished" podID="5739768e-3825-4869-9a20-d65269d6ff6e" containerID="18665a547215957278c7198a584f67fb5944ff6d98d64e5f47da861c793724c2" exitCode=0
Mar 13 12:25:58 crc kubenswrapper[4837]: I0313 12:25:58.425798 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6t59" event={"ID":"5739768e-3825-4869-9a20-d65269d6ff6e","Type":"ContainerDied","Data":"18665a547215957278c7198a584f67fb5944ff6d98d64e5f47da861c793724c2"}
Mar 13 12:25:58 crc kubenswrapper[4837]: I0313 12:25:58.426884 4837 generic.go:334] "Generic (PLEG): container finished" podID="fdac88ff-0567-4477-b88a-a90c2bc99da8" containerID="f67572c8c6ce19fc30d5a363241b6294efe0fe117e547d75720e88fd9323c357" exitCode=0
Mar 13 12:25:58 crc kubenswrapper[4837]: I0313 12:25:58.426917 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7dpp"
event={"ID":"fdac88ff-0567-4477-b88a-a90c2bc99da8","Type":"ContainerDied","Data":"f67572c8c6ce19fc30d5a363241b6294efe0fe117e547d75720e88fd9323c357"} Mar 13 12:25:58 crc kubenswrapper[4837]: I0313 12:25:58.426939 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7dpp" event={"ID":"fdac88ff-0567-4477-b88a-a90c2bc99da8","Type":"ContainerStarted","Data":"bc1c61427223537fb34eec963d583b2360136dfdbb62761ba6bcff070b488990"} Mar 13 12:25:59 crc kubenswrapper[4837]: I0313 12:25:59.436295 4837 generic.go:334] "Generic (PLEG): container finished" podID="fdac88ff-0567-4477-b88a-a90c2bc99da8" containerID="7b62114542f297a4d3e9e2cc215c273a290ed34518b0790734229b78c1fdfc3c" exitCode=0 Mar 13 12:25:59 crc kubenswrapper[4837]: I0313 12:25:59.436374 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7dpp" event={"ID":"fdac88ff-0567-4477-b88a-a90c2bc99da8","Type":"ContainerDied","Data":"7b62114542f297a4d3e9e2cc215c273a290ed34518b0790734229b78c1fdfc3c"} Mar 13 12:25:59 crc kubenswrapper[4837]: I0313 12:25:59.444398 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6t59" event={"ID":"5739768e-3825-4869-9a20-d65269d6ff6e","Type":"ContainerStarted","Data":"7515c5269cc2629534353ef272cf372654fdc13cb5d83e02f1693f6899aa19f0"} Mar 13 12:25:59 crc kubenswrapper[4837]: I0313 12:25:59.475702 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j6t59" podStartSLOduration=3.011868767 podStartE2EDuration="5.475682274s" podCreationTimestamp="2026-03-13 12:25:54 +0000 UTC" firstStartedPulling="2026-03-13 12:25:56.39946636 +0000 UTC m=+2272.037733123" lastFinishedPulling="2026-03-13 12:25:58.863279847 +0000 UTC m=+2274.501546630" observedRunningTime="2026-03-13 12:25:59.473758253 +0000 UTC m=+2275.112025026" watchObservedRunningTime="2026-03-13 12:25:59.475682274 +0000 UTC 
m=+2275.113949037" Mar 13 12:26:00 crc kubenswrapper[4837]: I0313 12:26:00.154753 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556746-vwjkq"] Mar 13 12:26:00 crc kubenswrapper[4837]: I0313 12:26:00.156469 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556746-vwjkq" Mar 13 12:26:00 crc kubenswrapper[4837]: I0313 12:26:00.158479 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:26:00 crc kubenswrapper[4837]: I0313 12:26:00.158975 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:26:00 crc kubenswrapper[4837]: I0313 12:26:00.159067 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:26:00 crc kubenswrapper[4837]: I0313 12:26:00.174388 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556746-vwjkq"] Mar 13 12:26:00 crc kubenswrapper[4837]: I0313 12:26:00.206996 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxsh6\" (UniqueName: \"kubernetes.io/projected/eb9a9c7b-13fc-4655-91b2-a388c3870bf8-kube-api-access-lxsh6\") pod \"auto-csr-approver-29556746-vwjkq\" (UID: \"eb9a9c7b-13fc-4655-91b2-a388c3870bf8\") " pod="openshift-infra/auto-csr-approver-29556746-vwjkq" Mar 13 12:26:00 crc kubenswrapper[4837]: I0313 12:26:00.309093 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxsh6\" (UniqueName: \"kubernetes.io/projected/eb9a9c7b-13fc-4655-91b2-a388c3870bf8-kube-api-access-lxsh6\") pod \"auto-csr-approver-29556746-vwjkq\" (UID: \"eb9a9c7b-13fc-4655-91b2-a388c3870bf8\") " pod="openshift-infra/auto-csr-approver-29556746-vwjkq" Mar 13 12:26:00 crc kubenswrapper[4837]: I0313 
12:26:00.329320 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxsh6\" (UniqueName: \"kubernetes.io/projected/eb9a9c7b-13fc-4655-91b2-a388c3870bf8-kube-api-access-lxsh6\") pod \"auto-csr-approver-29556746-vwjkq\" (UID: \"eb9a9c7b-13fc-4655-91b2-a388c3870bf8\") " pod="openshift-infra/auto-csr-approver-29556746-vwjkq" Mar 13 12:26:00 crc kubenswrapper[4837]: I0313 12:26:00.453986 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7dpp" event={"ID":"fdac88ff-0567-4477-b88a-a90c2bc99da8","Type":"ContainerStarted","Data":"a5eee508b483e64f809508b03aa1f0b24998bdde6d37da5807abe3cdc59f087e"} Mar 13 12:26:00 crc kubenswrapper[4837]: I0313 12:26:00.479516 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n7dpp" podStartSLOduration=2.097581473 podStartE2EDuration="3.479492118s" podCreationTimestamp="2026-03-13 12:25:57 +0000 UTC" firstStartedPulling="2026-03-13 12:25:58.431898867 +0000 UTC m=+2274.070165650" lastFinishedPulling="2026-03-13 12:25:59.813809532 +0000 UTC m=+2275.452076295" observedRunningTime="2026-03-13 12:26:00.469399402 +0000 UTC m=+2276.107666155" watchObservedRunningTime="2026-03-13 12:26:00.479492118 +0000 UTC m=+2276.117758881" Mar 13 12:26:00 crc kubenswrapper[4837]: I0313 12:26:00.480595 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556746-vwjkq" Mar 13 12:26:00 crc kubenswrapper[4837]: I0313 12:26:00.968474 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556746-vwjkq"] Mar 13 12:26:00 crc kubenswrapper[4837]: W0313 12:26:00.974034 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb9a9c7b_13fc_4655_91b2_a388c3870bf8.slice/crio-413d37f2c29c6b102d98d402abf30d47d543609ff1745230a949d42eb5a26c91 WatchSource:0}: Error finding container 413d37f2c29c6b102d98d402abf30d47d543609ff1745230a949d42eb5a26c91: Status 404 returned error can't find the container with id 413d37f2c29c6b102d98d402abf30d47d543609ff1745230a949d42eb5a26c91 Mar 13 12:26:01 crc kubenswrapper[4837]: I0313 12:26:01.467229 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556746-vwjkq" event={"ID":"eb9a9c7b-13fc-4655-91b2-a388c3870bf8","Type":"ContainerStarted","Data":"413d37f2c29c6b102d98d402abf30d47d543609ff1745230a949d42eb5a26c91"} Mar 13 12:26:02 crc kubenswrapper[4837]: I0313 12:26:02.489456 4837 generic.go:334] "Generic (PLEG): container finished" podID="eb9a9c7b-13fc-4655-91b2-a388c3870bf8" containerID="27d05aedac81655ab98a132d059aa69f642170fd7305465ba1bc55dadd819af6" exitCode=0 Mar 13 12:26:02 crc kubenswrapper[4837]: I0313 12:26:02.489531 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556746-vwjkq" event={"ID":"eb9a9c7b-13fc-4655-91b2-a388c3870bf8","Type":"ContainerDied","Data":"27d05aedac81655ab98a132d059aa69f642170fd7305465ba1bc55dadd819af6"} Mar 13 12:26:03 crc kubenswrapper[4837]: I0313 12:26:03.811494 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556746-vwjkq" Mar 13 12:26:03 crc kubenswrapper[4837]: I0313 12:26:03.875096 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxsh6\" (UniqueName: \"kubernetes.io/projected/eb9a9c7b-13fc-4655-91b2-a388c3870bf8-kube-api-access-lxsh6\") pod \"eb9a9c7b-13fc-4655-91b2-a388c3870bf8\" (UID: \"eb9a9c7b-13fc-4655-91b2-a388c3870bf8\") " Mar 13 12:26:03 crc kubenswrapper[4837]: I0313 12:26:03.883010 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb9a9c7b-13fc-4655-91b2-a388c3870bf8-kube-api-access-lxsh6" (OuterVolumeSpecName: "kube-api-access-lxsh6") pod "eb9a9c7b-13fc-4655-91b2-a388c3870bf8" (UID: "eb9a9c7b-13fc-4655-91b2-a388c3870bf8"). InnerVolumeSpecName "kube-api-access-lxsh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:26:03 crc kubenswrapper[4837]: I0313 12:26:03.978051 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxsh6\" (UniqueName: \"kubernetes.io/projected/eb9a9c7b-13fc-4655-91b2-a388c3870bf8-kube-api-access-lxsh6\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:04 crc kubenswrapper[4837]: I0313 12:26:04.514154 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556746-vwjkq" event={"ID":"eb9a9c7b-13fc-4655-91b2-a388c3870bf8","Type":"ContainerDied","Data":"413d37f2c29c6b102d98d402abf30d47d543609ff1745230a949d42eb5a26c91"} Mar 13 12:26:04 crc kubenswrapper[4837]: I0313 12:26:04.514209 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556746-vwjkq" Mar 13 12:26:04 crc kubenswrapper[4837]: I0313 12:26:04.514225 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="413d37f2c29c6b102d98d402abf30d47d543609ff1745230a949d42eb5a26c91" Mar 13 12:26:04 crc kubenswrapper[4837]: I0313 12:26:04.879602 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556740-snmw2"] Mar 13 12:26:04 crc kubenswrapper[4837]: I0313 12:26:04.887508 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556740-snmw2"] Mar 13 12:26:05 crc kubenswrapper[4837]: I0313 12:26:05.058454 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e01710d7-a463-41fe-9d86-2410a8ccd8e8" path="/var/lib/kubelet/pods/e01710d7-a463-41fe-9d86-2410a8ccd8e8/volumes" Mar 13 12:26:05 crc kubenswrapper[4837]: I0313 12:26:05.314619 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j6t59" Mar 13 12:26:05 crc kubenswrapper[4837]: I0313 12:26:05.314751 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j6t59" Mar 13 12:26:05 crc kubenswrapper[4837]: I0313 12:26:05.361566 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j6t59" Mar 13 12:26:05 crc kubenswrapper[4837]: I0313 12:26:05.483804 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:26:05 crc kubenswrapper[4837]: I0313 12:26:05.483877 4837 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:26:05 crc kubenswrapper[4837]: I0313 12:26:05.483927 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 12:26:05 crc kubenswrapper[4837]: I0313 12:26:05.484821 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:26:05 crc kubenswrapper[4837]: I0313 12:26:05.484889 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" gracePeriod=600 Mar 13 12:26:05 crc kubenswrapper[4837]: I0313 12:26:05.581129 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j6t59" Mar 13 12:26:05 crc kubenswrapper[4837]: E0313 12:26:05.626866 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 
12:26:05 crc kubenswrapper[4837]: I0313 12:26:05.629013 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j6t59"] Mar 13 12:26:06 crc kubenswrapper[4837]: I0313 12:26:06.533799 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" exitCode=0 Mar 13 12:26:06 crc kubenswrapper[4837]: I0313 12:26:06.533849 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504"} Mar 13 12:26:06 crc kubenswrapper[4837]: I0313 12:26:06.533907 4837 scope.go:117] "RemoveContainer" containerID="95ed8f8c7021ad56734ed8e8626e89cd8f2efcdf1bf9a33ce258f19439eeb037" Mar 13 12:26:06 crc kubenswrapper[4837]: I0313 12:26:06.534577 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:26:06 crc kubenswrapper[4837]: E0313 12:26:06.534828 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:26:07 crc kubenswrapper[4837]: I0313 12:26:07.545253 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j6t59" podUID="5739768e-3825-4869-9a20-d65269d6ff6e" containerName="registry-server" containerID="cri-o://7515c5269cc2629534353ef272cf372654fdc13cb5d83e02f1693f6899aa19f0" gracePeriod=2 Mar 13 12:26:07 crc 
kubenswrapper[4837]: I0313 12:26:07.694934 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n7dpp" Mar 13 12:26:07 crc kubenswrapper[4837]: I0313 12:26:07.694984 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n7dpp" Mar 13 12:26:07 crc kubenswrapper[4837]: I0313 12:26:07.753171 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n7dpp" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.012100 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6t59" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.058814 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5739768e-3825-4869-9a20-d65269d6ff6e-catalog-content\") pod \"5739768e-3825-4869-9a20-d65269d6ff6e\" (UID: \"5739768e-3825-4869-9a20-d65269d6ff6e\") " Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.058898 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5739768e-3825-4869-9a20-d65269d6ff6e-utilities\") pod \"5739768e-3825-4869-9a20-d65269d6ff6e\" (UID: \"5739768e-3825-4869-9a20-d65269d6ff6e\") " Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.059057 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69jzh\" (UniqueName: \"kubernetes.io/projected/5739768e-3825-4869-9a20-d65269d6ff6e-kube-api-access-69jzh\") pod \"5739768e-3825-4869-9a20-d65269d6ff6e\" (UID: \"5739768e-3825-4869-9a20-d65269d6ff6e\") " Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.059978 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5739768e-3825-4869-9a20-d65269d6ff6e-utilities" (OuterVolumeSpecName: "utilities") pod "5739768e-3825-4869-9a20-d65269d6ff6e" (UID: "5739768e-3825-4869-9a20-d65269d6ff6e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.065421 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5739768e-3825-4869-9a20-d65269d6ff6e-kube-api-access-69jzh" (OuterVolumeSpecName: "kube-api-access-69jzh") pod "5739768e-3825-4869-9a20-d65269d6ff6e" (UID: "5739768e-3825-4869-9a20-d65269d6ff6e"). InnerVolumeSpecName "kube-api-access-69jzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.108613 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5739768e-3825-4869-9a20-d65269d6ff6e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5739768e-3825-4869-9a20-d65269d6ff6e" (UID: "5739768e-3825-4869-9a20-d65269d6ff6e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.162126 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5739768e-3825-4869-9a20-d65269d6ff6e-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.162169 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69jzh\" (UniqueName: \"kubernetes.io/projected/5739768e-3825-4869-9a20-d65269d6ff6e-kube-api-access-69jzh\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.162180 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5739768e-3825-4869-9a20-d65269d6ff6e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.556533 4837 generic.go:334] "Generic (PLEG): container finished" podID="5739768e-3825-4869-9a20-d65269d6ff6e" containerID="7515c5269cc2629534353ef272cf372654fdc13cb5d83e02f1693f6899aa19f0" exitCode=0 Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.556601 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6t59" event={"ID":"5739768e-3825-4869-9a20-d65269d6ff6e","Type":"ContainerDied","Data":"7515c5269cc2629534353ef272cf372654fdc13cb5d83e02f1693f6899aa19f0"} Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.556652 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j6t59" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.556697 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6t59" event={"ID":"5739768e-3825-4869-9a20-d65269d6ff6e","Type":"ContainerDied","Data":"6e5ab350326a007d05239aba067c1ae7270bb4feadf1120d3dff1a07c76500a1"} Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.556751 4837 scope.go:117] "RemoveContainer" containerID="7515c5269cc2629534353ef272cf372654fdc13cb5d83e02f1693f6899aa19f0" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.579937 4837 scope.go:117] "RemoveContainer" containerID="18665a547215957278c7198a584f67fb5944ff6d98d64e5f47da861c793724c2" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.603489 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j6t59"] Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.618739 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n7dpp" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.625285 4837 scope.go:117] "RemoveContainer" containerID="8ec46fdf64b8b41f5ed86e596fecb59aef4dd2e3445048553d57b57f44cdf29d" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.627199 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j6t59"] Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.656462 4837 scope.go:117] "RemoveContainer" containerID="7515c5269cc2629534353ef272cf372654fdc13cb5d83e02f1693f6899aa19f0" Mar 13 12:26:08 crc kubenswrapper[4837]: E0313 12:26:08.657028 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7515c5269cc2629534353ef272cf372654fdc13cb5d83e02f1693f6899aa19f0\": container with ID starting with 
7515c5269cc2629534353ef272cf372654fdc13cb5d83e02f1693f6899aa19f0 not found: ID does not exist" containerID="7515c5269cc2629534353ef272cf372654fdc13cb5d83e02f1693f6899aa19f0" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.657078 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7515c5269cc2629534353ef272cf372654fdc13cb5d83e02f1693f6899aa19f0"} err="failed to get container status \"7515c5269cc2629534353ef272cf372654fdc13cb5d83e02f1693f6899aa19f0\": rpc error: code = NotFound desc = could not find container \"7515c5269cc2629534353ef272cf372654fdc13cb5d83e02f1693f6899aa19f0\": container with ID starting with 7515c5269cc2629534353ef272cf372654fdc13cb5d83e02f1693f6899aa19f0 not found: ID does not exist" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.657110 4837 scope.go:117] "RemoveContainer" containerID="18665a547215957278c7198a584f67fb5944ff6d98d64e5f47da861c793724c2" Mar 13 12:26:08 crc kubenswrapper[4837]: E0313 12:26:08.657481 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18665a547215957278c7198a584f67fb5944ff6d98d64e5f47da861c793724c2\": container with ID starting with 18665a547215957278c7198a584f67fb5944ff6d98d64e5f47da861c793724c2 not found: ID does not exist" containerID="18665a547215957278c7198a584f67fb5944ff6d98d64e5f47da861c793724c2" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.657524 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18665a547215957278c7198a584f67fb5944ff6d98d64e5f47da861c793724c2"} err="failed to get container status \"18665a547215957278c7198a584f67fb5944ff6d98d64e5f47da861c793724c2\": rpc error: code = NotFound desc = could not find container \"18665a547215957278c7198a584f67fb5944ff6d98d64e5f47da861c793724c2\": container with ID starting with 18665a547215957278c7198a584f67fb5944ff6d98d64e5f47da861c793724c2 not found: ID does not 
exist" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.657551 4837 scope.go:117] "RemoveContainer" containerID="8ec46fdf64b8b41f5ed86e596fecb59aef4dd2e3445048553d57b57f44cdf29d" Mar 13 12:26:08 crc kubenswrapper[4837]: E0313 12:26:08.657835 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ec46fdf64b8b41f5ed86e596fecb59aef4dd2e3445048553d57b57f44cdf29d\": container with ID starting with 8ec46fdf64b8b41f5ed86e596fecb59aef4dd2e3445048553d57b57f44cdf29d not found: ID does not exist" containerID="8ec46fdf64b8b41f5ed86e596fecb59aef4dd2e3445048553d57b57f44cdf29d" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.657865 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ec46fdf64b8b41f5ed86e596fecb59aef4dd2e3445048553d57b57f44cdf29d"} err="failed to get container status \"8ec46fdf64b8b41f5ed86e596fecb59aef4dd2e3445048553d57b57f44cdf29d\": rpc error: code = NotFound desc = could not find container \"8ec46fdf64b8b41f5ed86e596fecb59aef4dd2e3445048553d57b57f44cdf29d\": container with ID starting with 8ec46fdf64b8b41f5ed86e596fecb59aef4dd2e3445048553d57b57f44cdf29d not found: ID does not exist" Mar 13 12:26:09 crc kubenswrapper[4837]: I0313 12:26:09.072910 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5739768e-3825-4869-9a20-d65269d6ff6e" path="/var/lib/kubelet/pods/5739768e-3825-4869-9a20-d65269d6ff6e/volumes" Mar 13 12:26:12 crc kubenswrapper[4837]: I0313 12:26:12.202049 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7dpp"] Mar 13 12:26:12 crc kubenswrapper[4837]: I0313 12:26:12.202297 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n7dpp" podUID="fdac88ff-0567-4477-b88a-a90c2bc99da8" containerName="registry-server" 
containerID="cri-o://a5eee508b483e64f809508b03aa1f0b24998bdde6d37da5807abe3cdc59f087e" gracePeriod=2 Mar 13 12:26:12 crc kubenswrapper[4837]: I0313 12:26:12.607397 4837 generic.go:334] "Generic (PLEG): container finished" podID="fdac88ff-0567-4477-b88a-a90c2bc99da8" containerID="a5eee508b483e64f809508b03aa1f0b24998bdde6d37da5807abe3cdc59f087e" exitCode=0 Mar 13 12:26:12 crc kubenswrapper[4837]: I0313 12:26:12.607421 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7dpp" event={"ID":"fdac88ff-0567-4477-b88a-a90c2bc99da8","Type":"ContainerDied","Data":"a5eee508b483e64f809508b03aa1f0b24998bdde6d37da5807abe3cdc59f087e"} Mar 13 12:26:12 crc kubenswrapper[4837]: I0313 12:26:12.607787 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7dpp" event={"ID":"fdac88ff-0567-4477-b88a-a90c2bc99da8","Type":"ContainerDied","Data":"bc1c61427223537fb34eec963d583b2360136dfdbb62761ba6bcff070b488990"} Mar 13 12:26:12 crc kubenswrapper[4837]: I0313 12:26:12.607806 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc1c61427223537fb34eec963d583b2360136dfdbb62761ba6bcff070b488990" Mar 13 12:26:12 crc kubenswrapper[4837]: I0313 12:26:12.683227 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n7dpp" Mar 13 12:26:12 crc kubenswrapper[4837]: I0313 12:26:12.746432 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdac88ff-0567-4477-b88a-a90c2bc99da8-catalog-content\") pod \"fdac88ff-0567-4477-b88a-a90c2bc99da8\" (UID: \"fdac88ff-0567-4477-b88a-a90c2bc99da8\") " Mar 13 12:26:12 crc kubenswrapper[4837]: I0313 12:26:12.746476 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c9hp\" (UniqueName: \"kubernetes.io/projected/fdac88ff-0567-4477-b88a-a90c2bc99da8-kube-api-access-7c9hp\") pod \"fdac88ff-0567-4477-b88a-a90c2bc99da8\" (UID: \"fdac88ff-0567-4477-b88a-a90c2bc99da8\") " Mar 13 12:26:12 crc kubenswrapper[4837]: I0313 12:26:12.746601 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdac88ff-0567-4477-b88a-a90c2bc99da8-utilities\") pod \"fdac88ff-0567-4477-b88a-a90c2bc99da8\" (UID: \"fdac88ff-0567-4477-b88a-a90c2bc99da8\") " Mar 13 12:26:12 crc kubenswrapper[4837]: I0313 12:26:12.747627 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdac88ff-0567-4477-b88a-a90c2bc99da8-utilities" (OuterVolumeSpecName: "utilities") pod "fdac88ff-0567-4477-b88a-a90c2bc99da8" (UID: "fdac88ff-0567-4477-b88a-a90c2bc99da8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:26:13 crc kubenswrapper[4837]: I0313 12:26:12.772915 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdac88ff-0567-4477-b88a-a90c2bc99da8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fdac88ff-0567-4477-b88a-a90c2bc99da8" (UID: "fdac88ff-0567-4477-b88a-a90c2bc99da8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:26:13 crc kubenswrapper[4837]: I0313 12:26:13.615457 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n7dpp" Mar 13 12:26:13 crc kubenswrapper[4837]: I0313 12:26:13.643585 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdac88ff-0567-4477-b88a-a90c2bc99da8-kube-api-access-7c9hp" (OuterVolumeSpecName: "kube-api-access-7c9hp") pod "fdac88ff-0567-4477-b88a-a90c2bc99da8" (UID: "fdac88ff-0567-4477-b88a-a90c2bc99da8"). InnerVolumeSpecName "kube-api-access-7c9hp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:26:13 crc kubenswrapper[4837]: I0313 12:26:13.648962 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdac88ff-0567-4477-b88a-a90c2bc99da8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:13 crc kubenswrapper[4837]: I0313 12:26:13.648993 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c9hp\" (UniqueName: \"kubernetes.io/projected/fdac88ff-0567-4477-b88a-a90c2bc99da8-kube-api-access-7c9hp\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:13 crc kubenswrapper[4837]: I0313 12:26:13.649006 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdac88ff-0567-4477-b88a-a90c2bc99da8-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:13 crc kubenswrapper[4837]: I0313 12:26:13.937934 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7dpp"] Mar 13 12:26:13 crc kubenswrapper[4837]: I0313 12:26:13.952404 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7dpp"] Mar 13 12:26:15 crc kubenswrapper[4837]: I0313 12:26:15.060814 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="fdac88ff-0567-4477-b88a-a90c2bc99da8" path="/var/lib/kubelet/pods/fdac88ff-0567-4477-b88a-a90c2bc99da8/volumes" Mar 13 12:26:18 crc kubenswrapper[4837]: I0313 12:26:18.048898 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:26:18 crc kubenswrapper[4837]: E0313 12:26:18.049326 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:26:29 crc kubenswrapper[4837]: I0313 12:26:29.049234 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:26:29 crc kubenswrapper[4837]: E0313 12:26:29.050514 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:26:30 crc kubenswrapper[4837]: I0313 12:26:30.355419 4837 scope.go:117] "RemoveContainer" containerID="5c53da3a56d1c8f877bdab4d65362dc1a8c31f8cd4991718456d0c1946898d66" Mar 13 12:26:40 crc kubenswrapper[4837]: I0313 12:26:40.048201 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:26:40 crc kubenswrapper[4837]: E0313 12:26:40.050572 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:26:51 crc kubenswrapper[4837]: I0313 12:26:51.952619 4837 generic.go:334] "Generic (PLEG): container finished" podID="e6986f16-e143-49f4-81e5-58abba717876" containerID="18a83cd1cba4b0ec8cbb0763088a8fc20438f178de5bb307e3e42d268b1d9ec5" exitCode=0 Mar 13 12:26:51 crc kubenswrapper[4837]: I0313 12:26:51.952692 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" event={"ID":"e6986f16-e143-49f4-81e5-58abba717876","Type":"ContainerDied","Data":"18a83cd1cba4b0ec8cbb0763088a8fc20438f178de5bb307e3e42d268b1d9ec5"} Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.404190 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.577348 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-2\") pod \"e6986f16-e143-49f4-81e5-58abba717876\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.577414 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-3\") pod \"e6986f16-e143-49f4-81e5-58abba717876\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.577439 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-ssh-key-openstack-edpm-ipam\") pod \"e6986f16-e143-49f4-81e5-58abba717876\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.577482 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-0\") pod \"e6986f16-e143-49f4-81e5-58abba717876\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.577514 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e6986f16-e143-49f4-81e5-58abba717876-nova-extra-config-0\") pod \"e6986f16-e143-49f4-81e5-58abba717876\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " Mar 13 12:26:53 crc 
kubenswrapper[4837]: I0313 12:26:53.577561 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzqgf\" (UniqueName: \"kubernetes.io/projected/e6986f16-e143-49f4-81e5-58abba717876-kube-api-access-fzqgf\") pod \"e6986f16-e143-49f4-81e5-58abba717876\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.577615 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-migration-ssh-key-0\") pod \"e6986f16-e143-49f4-81e5-58abba717876\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.577668 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-inventory\") pod \"e6986f16-e143-49f4-81e5-58abba717876\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.577743 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-1\") pod \"e6986f16-e143-49f4-81e5-58abba717876\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.577796 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-combined-ca-bundle\") pod \"e6986f16-e143-49f4-81e5-58abba717876\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.577833 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-migration-ssh-key-1\") pod \"e6986f16-e143-49f4-81e5-58abba717876\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.583587 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6986f16-e143-49f4-81e5-58abba717876-kube-api-access-fzqgf" (OuterVolumeSpecName: "kube-api-access-fzqgf") pod "e6986f16-e143-49f4-81e5-58abba717876" (UID: "e6986f16-e143-49f4-81e5-58abba717876"). InnerVolumeSpecName "kube-api-access-fzqgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.583738 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "e6986f16-e143-49f4-81e5-58abba717876" (UID: "e6986f16-e143-49f4-81e5-58abba717876"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.607206 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "e6986f16-e143-49f4-81e5-58abba717876" (UID: "e6986f16-e143-49f4-81e5-58abba717876"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.607300 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "e6986f16-e143-49f4-81e5-58abba717876" (UID: "e6986f16-e143-49f4-81e5-58abba717876"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.608596 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e6986f16-e143-49f4-81e5-58abba717876" (UID: "e6986f16-e143-49f4-81e5-58abba717876"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.609197 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "e6986f16-e143-49f4-81e5-58abba717876" (UID: "e6986f16-e143-49f4-81e5-58abba717876"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.613836 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "e6986f16-e143-49f4-81e5-58abba717876" (UID: "e6986f16-e143-49f4-81e5-58abba717876"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.615620 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-inventory" (OuterVolumeSpecName: "inventory") pod "e6986f16-e143-49f4-81e5-58abba717876" (UID: "e6986f16-e143-49f4-81e5-58abba717876"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.617918 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "e6986f16-e143-49f4-81e5-58abba717876" (UID: "e6986f16-e143-49f4-81e5-58abba717876"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.628116 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "e6986f16-e143-49f4-81e5-58abba717876" (UID: "e6986f16-e143-49f4-81e5-58abba717876"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.642824 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6986f16-e143-49f4-81e5-58abba717876-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "e6986f16-e143-49f4-81e5-58abba717876" (UID: "e6986f16-e143-49f4-81e5-58abba717876"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.678755 4837 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.678794 4837 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.678803 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.678815 4837 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.678825 4837 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e6986f16-e143-49f4-81e5-58abba717876-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.678833 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzqgf\" (UniqueName: \"kubernetes.io/projected/e6986f16-e143-49f4-81e5-58abba717876-kube-api-access-fzqgf\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.678842 4837 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.678852 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.678861 4837 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.678869 4837 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.678878 4837 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.972684 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" event={"ID":"e6986f16-e143-49f4-81e5-58abba717876","Type":"ContainerDied","Data":"7cc82d3fd71f5170b65dc07b84d25573c7febed498176a66fed2dfd3b4619643"} Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.972992 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cc82d3fd71f5170b65dc07b84d25573c7febed498176a66fed2dfd3b4619643" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.972725 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.049557 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:26:54 crc kubenswrapper[4837]: E0313 12:26:54.050023 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.083558 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x"] Mar 13 12:26:54 crc kubenswrapper[4837]: E0313 12:26:54.084135 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb9a9c7b-13fc-4655-91b2-a388c3870bf8" containerName="oc" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.084157 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb9a9c7b-13fc-4655-91b2-a388c3870bf8" containerName="oc" Mar 13 12:26:54 crc kubenswrapper[4837]: E0313 12:26:54.084170 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdac88ff-0567-4477-b88a-a90c2bc99da8" containerName="extract-utilities" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.084177 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdac88ff-0567-4477-b88a-a90c2bc99da8" containerName="extract-utilities" Mar 13 12:26:54 crc kubenswrapper[4837]: E0313 12:26:54.084199 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5739768e-3825-4869-9a20-d65269d6ff6e" containerName="extract-content" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.084205 4837 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5739768e-3825-4869-9a20-d65269d6ff6e" containerName="extract-content" Mar 13 12:26:54 crc kubenswrapper[4837]: E0313 12:26:54.084215 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5739768e-3825-4869-9a20-d65269d6ff6e" containerName="extract-utilities" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.084220 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5739768e-3825-4869-9a20-d65269d6ff6e" containerName="extract-utilities" Mar 13 12:26:54 crc kubenswrapper[4837]: E0313 12:26:54.084231 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5739768e-3825-4869-9a20-d65269d6ff6e" containerName="registry-server" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.084237 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5739768e-3825-4869-9a20-d65269d6ff6e" containerName="registry-server" Mar 13 12:26:54 crc kubenswrapper[4837]: E0313 12:26:54.084255 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdac88ff-0567-4477-b88a-a90c2bc99da8" containerName="registry-server" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.084260 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdac88ff-0567-4477-b88a-a90c2bc99da8" containerName="registry-server" Mar 13 12:26:54 crc kubenswrapper[4837]: E0313 12:26:54.084277 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdac88ff-0567-4477-b88a-a90c2bc99da8" containerName="extract-content" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.084283 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdac88ff-0567-4477-b88a-a90c2bc99da8" containerName="extract-content" Mar 13 12:26:54 crc kubenswrapper[4837]: E0313 12:26:54.084298 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6986f16-e143-49f4-81e5-58abba717876" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 
12:26:54.084304 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6986f16-e143-49f4-81e5-58abba717876" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.084717 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5739768e-3825-4869-9a20-d65269d6ff6e" containerName="registry-server" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.084767 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb9a9c7b-13fc-4655-91b2-a388c3870bf8" containerName="oc" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.084783 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdac88ff-0567-4477-b88a-a90c2bc99da8" containerName="registry-server" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.084816 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6986f16-e143-49f4-81e5-58abba717876" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.085726 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.088169 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.088473 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.089358 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.089443 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.090350 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.097704 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x"] Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.289689 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.289739 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ssh-key-openstack-edpm-ipam\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.289843 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.289976 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.290043 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.290123 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.290179 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7pcx\" (UniqueName: \"kubernetes.io/projected/ac15848f-4f6f-4159-828f-d30a77f93a4b-kube-api-access-w7pcx\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.391802 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.391930 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7pcx\" (UniqueName: \"kubernetes.io/projected/ac15848f-4f6f-4159-828f-d30a77f93a4b-kube-api-access-w7pcx\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.391975 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.392001 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.392045 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.392146 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.392204 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.397301 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-inventory\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.397812 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.398092 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.398215 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.399203 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.407158 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.411886 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7pcx\" (UniqueName: \"kubernetes.io/projected/ac15848f-4f6f-4159-828f-d30a77f93a4b-kube-api-access-w7pcx\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.702412 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:55 crc kubenswrapper[4837]: I0313 12:26:55.277905 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x"] Mar 13 12:26:56 crc kubenswrapper[4837]: I0313 12:26:56.002298 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" event={"ID":"ac15848f-4f6f-4159-828f-d30a77f93a4b","Type":"ContainerStarted","Data":"66c3c97b0179ed9e241ac7948bb5392842023bac50cbf42e15411d47abcc3b2e"} Mar 13 12:26:56 crc kubenswrapper[4837]: I0313 12:26:56.002869 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" event={"ID":"ac15848f-4f6f-4159-828f-d30a77f93a4b","Type":"ContainerStarted","Data":"b7bde5f05c7e3a4c4bb83a3a8eca2fac351228924917d2ac26964f39056c8c9f"} Mar 13 12:26:56 crc kubenswrapper[4837]: I0313 12:26:56.030521 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" podStartSLOduration=1.605321947 podStartE2EDuration="2.030499163s" podCreationTimestamp="2026-03-13 12:26:54 +0000 UTC" firstStartedPulling="2026-03-13 12:26:55.284040996 +0000 UTC m=+2330.922307759" lastFinishedPulling="2026-03-13 12:26:55.709218212 +0000 UTC m=+2331.347484975" observedRunningTime="2026-03-13 12:26:56.020888892 +0000 UTC m=+2331.659155675" watchObservedRunningTime="2026-03-13 12:26:56.030499163 +0000 UTC m=+2331.668765926" Mar 13 12:27:09 crc kubenswrapper[4837]: I0313 12:27:09.055206 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:27:09 crc kubenswrapper[4837]: E0313 12:27:09.056774 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:27:20 crc kubenswrapper[4837]: I0313 12:27:20.048819 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:27:20 crc kubenswrapper[4837]: E0313 12:27:20.050111 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:27:31 crc kubenswrapper[4837]: I0313 12:27:31.049334 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:27:31 crc kubenswrapper[4837]: E0313 12:27:31.050686 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:27:43 crc kubenswrapper[4837]: I0313 12:27:43.048720 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:27:43 crc kubenswrapper[4837]: E0313 12:27:43.049783 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:27:54 crc kubenswrapper[4837]: I0313 12:27:54.048928 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:27:54 crc kubenswrapper[4837]: E0313 12:27:54.049509 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:28:00 crc kubenswrapper[4837]: I0313 12:28:00.138931 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556748-dgqh7"] Mar 13 12:28:00 crc kubenswrapper[4837]: I0313 12:28:00.140441 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556748-dgqh7" Mar 13 12:28:00 crc kubenswrapper[4837]: I0313 12:28:00.144868 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:28:00 crc kubenswrapper[4837]: I0313 12:28:00.144872 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:28:00 crc kubenswrapper[4837]: I0313 12:28:00.145032 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:28:00 crc kubenswrapper[4837]: I0313 12:28:00.161861 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556748-dgqh7"] Mar 13 12:28:00 crc kubenswrapper[4837]: I0313 12:28:00.264278 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlhlt\" (UniqueName: \"kubernetes.io/projected/72d56baf-1d17-4cbb-a351-8f5bf373c768-kube-api-access-dlhlt\") pod \"auto-csr-approver-29556748-dgqh7\" (UID: \"72d56baf-1d17-4cbb-a351-8f5bf373c768\") " pod="openshift-infra/auto-csr-approver-29556748-dgqh7" Mar 13 12:28:00 crc kubenswrapper[4837]: I0313 12:28:00.366755 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlhlt\" (UniqueName: \"kubernetes.io/projected/72d56baf-1d17-4cbb-a351-8f5bf373c768-kube-api-access-dlhlt\") pod \"auto-csr-approver-29556748-dgqh7\" (UID: \"72d56baf-1d17-4cbb-a351-8f5bf373c768\") " pod="openshift-infra/auto-csr-approver-29556748-dgqh7" Mar 13 12:28:00 crc kubenswrapper[4837]: I0313 12:28:00.385268 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlhlt\" (UniqueName: \"kubernetes.io/projected/72d56baf-1d17-4cbb-a351-8f5bf373c768-kube-api-access-dlhlt\") pod \"auto-csr-approver-29556748-dgqh7\" (UID: \"72d56baf-1d17-4cbb-a351-8f5bf373c768\") " 
pod="openshift-infra/auto-csr-approver-29556748-dgqh7" Mar 13 12:28:00 crc kubenswrapper[4837]: I0313 12:28:00.470427 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556748-dgqh7" Mar 13 12:28:00 crc kubenswrapper[4837]: I0313 12:28:00.955774 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556748-dgqh7"] Mar 13 12:28:01 crc kubenswrapper[4837]: I0313 12:28:01.562691 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556748-dgqh7" event={"ID":"72d56baf-1d17-4cbb-a351-8f5bf373c768","Type":"ContainerStarted","Data":"16962aea59727abab979a02d6decfef8fe22e9a60374e99da3395d6326096e2e"} Mar 13 12:28:02 crc kubenswrapper[4837]: I0313 12:28:02.576856 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556748-dgqh7" event={"ID":"72d56baf-1d17-4cbb-a351-8f5bf373c768","Type":"ContainerStarted","Data":"1f6ecae5057b8984cf0bda7716bd86797af96a5f4c3a84ef8f5f85cb3c1def23"} Mar 13 12:28:02 crc kubenswrapper[4837]: I0313 12:28:02.597772 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556748-dgqh7" podStartSLOduration=1.393065649 podStartE2EDuration="2.59774881s" podCreationTimestamp="2026-03-13 12:28:00 +0000 UTC" firstStartedPulling="2026-03-13 12:28:00.960944594 +0000 UTC m=+2396.599211357" lastFinishedPulling="2026-03-13 12:28:02.165627755 +0000 UTC m=+2397.803894518" observedRunningTime="2026-03-13 12:28:02.589165051 +0000 UTC m=+2398.227431824" watchObservedRunningTime="2026-03-13 12:28:02.59774881 +0000 UTC m=+2398.236015573" Mar 13 12:28:03 crc kubenswrapper[4837]: I0313 12:28:03.588869 4837 generic.go:334] "Generic (PLEG): container finished" podID="72d56baf-1d17-4cbb-a351-8f5bf373c768" containerID="1f6ecae5057b8984cf0bda7716bd86797af96a5f4c3a84ef8f5f85cb3c1def23" exitCode=0 Mar 13 12:28:03 crc 
kubenswrapper[4837]: I0313 12:28:03.588909 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556748-dgqh7" event={"ID":"72d56baf-1d17-4cbb-a351-8f5bf373c768","Type":"ContainerDied","Data":"1f6ecae5057b8984cf0bda7716bd86797af96a5f4c3a84ef8f5f85cb3c1def23"} Mar 13 12:28:04 crc kubenswrapper[4837]: I0313 12:28:04.931328 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556748-dgqh7" Mar 13 12:28:04 crc kubenswrapper[4837]: I0313 12:28:04.956016 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlhlt\" (UniqueName: \"kubernetes.io/projected/72d56baf-1d17-4cbb-a351-8f5bf373c768-kube-api-access-dlhlt\") pod \"72d56baf-1d17-4cbb-a351-8f5bf373c768\" (UID: \"72d56baf-1d17-4cbb-a351-8f5bf373c768\") " Mar 13 12:28:04 crc kubenswrapper[4837]: I0313 12:28:04.967792 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72d56baf-1d17-4cbb-a351-8f5bf373c768-kube-api-access-dlhlt" (OuterVolumeSpecName: "kube-api-access-dlhlt") pod "72d56baf-1d17-4cbb-a351-8f5bf373c768" (UID: "72d56baf-1d17-4cbb-a351-8f5bf373c768"). InnerVolumeSpecName "kube-api-access-dlhlt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:28:05 crc kubenswrapper[4837]: I0313 12:28:05.057837 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlhlt\" (UniqueName: \"kubernetes.io/projected/72d56baf-1d17-4cbb-a351-8f5bf373c768-kube-api-access-dlhlt\") on node \"crc\" DevicePath \"\"" Mar 13 12:28:05 crc kubenswrapper[4837]: I0313 12:28:05.610032 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556748-dgqh7" event={"ID":"72d56baf-1d17-4cbb-a351-8f5bf373c768","Type":"ContainerDied","Data":"16962aea59727abab979a02d6decfef8fe22e9a60374e99da3395d6326096e2e"} Mar 13 12:28:05 crc kubenswrapper[4837]: I0313 12:28:05.610085 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16962aea59727abab979a02d6decfef8fe22e9a60374e99da3395d6326096e2e" Mar 13 12:28:05 crc kubenswrapper[4837]: I0313 12:28:05.610157 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556748-dgqh7" Mar 13 12:28:05 crc kubenswrapper[4837]: I0313 12:28:05.660575 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556742-5ggnq"] Mar 13 12:28:05 crc kubenswrapper[4837]: I0313 12:28:05.670747 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556742-5ggnq"] Mar 13 12:28:07 crc kubenswrapper[4837]: I0313 12:28:07.049034 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:28:07 crc kubenswrapper[4837]: E0313 12:28:07.049367 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:28:07 crc kubenswrapper[4837]: I0313 12:28:07.061466 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aed6dbbf-3a09-4b60-9757-7c74a07f9c63" path="/var/lib/kubelet/pods/aed6dbbf-3a09-4b60-9757-7c74a07f9c63/volumes" Mar 13 12:28:13 crc kubenswrapper[4837]: E0313 12:28:13.217512 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72d56baf_1d17_4cbb_a351_8f5bf373c768.slice/crio-16962aea59727abab979a02d6decfef8fe22e9a60374e99da3395d6326096e2e\": RecentStats: unable to find data in memory cache]" Mar 13 12:28:18 crc kubenswrapper[4837]: I0313 12:28:18.048611 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:28:18 crc kubenswrapper[4837]: E0313 12:28:18.049087 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:28:23 crc kubenswrapper[4837]: E0313 12:28:23.476257 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72d56baf_1d17_4cbb_a351_8f5bf373c768.slice/crio-16962aea59727abab979a02d6decfef8fe22e9a60374e99da3395d6326096e2e\": RecentStats: unable to find data in memory cache]" Mar 13 12:28:30 crc kubenswrapper[4837]: I0313 12:28:30.464013 4837 scope.go:117] "RemoveContainer" 
containerID="852beb2b4218c3ee146b9596afb327ce3ec642be20ae0116d12166c03475804d" Mar 13 12:28:33 crc kubenswrapper[4837]: I0313 12:28:33.048747 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:28:33 crc kubenswrapper[4837]: E0313 12:28:33.049254 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:28:33 crc kubenswrapper[4837]: E0313 12:28:33.770898 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72d56baf_1d17_4cbb_a351_8f5bf373c768.slice/crio-16962aea59727abab979a02d6decfef8fe22e9a60374e99da3395d6326096e2e\": RecentStats: unable to find data in memory cache]" Mar 13 12:28:44 crc kubenswrapper[4837]: E0313 12:28:44.030134 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72d56baf_1d17_4cbb_a351_8f5bf373c768.slice/crio-16962aea59727abab979a02d6decfef8fe22e9a60374e99da3395d6326096e2e\": RecentStats: unable to find data in memory cache]" Mar 13 12:28:45 crc kubenswrapper[4837]: I0313 12:28:45.057411 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:28:45 crc kubenswrapper[4837]: E0313 12:28:45.058077 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:28:54 crc kubenswrapper[4837]: E0313 12:28:54.260624 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72d56baf_1d17_4cbb_a351_8f5bf373c768.slice/crio-16962aea59727abab979a02d6decfef8fe22e9a60374e99da3395d6326096e2e\": RecentStats: unable to find data in memory cache]" Mar 13 12:28:56 crc kubenswrapper[4837]: I0313 12:28:56.048457 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:28:56 crc kubenswrapper[4837]: E0313 12:28:56.049174 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:29:01 crc kubenswrapper[4837]: I0313 12:29:01.113592 4837 generic.go:334] "Generic (PLEG): container finished" podID="ac15848f-4f6f-4159-828f-d30a77f93a4b" containerID="66c3c97b0179ed9e241ac7948bb5392842023bac50cbf42e15411d47abcc3b2e" exitCode=0 Mar 13 12:29:01 crc kubenswrapper[4837]: I0313 12:29:01.113720 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" event={"ID":"ac15848f-4f6f-4159-828f-d30a77f93a4b","Type":"ContainerDied","Data":"66c3c97b0179ed9e241ac7948bb5392842023bac50cbf42e15411d47abcc3b2e"} Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.506280 4837 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.684528 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-2\") pod \"ac15848f-4f6f-4159-828f-d30a77f93a4b\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.684711 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ssh-key-openstack-edpm-ipam\") pod \"ac15848f-4f6f-4159-828f-d30a77f93a4b\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.684780 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-1\") pod \"ac15848f-4f6f-4159-828f-d30a77f93a4b\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.684811 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7pcx\" (UniqueName: \"kubernetes.io/projected/ac15848f-4f6f-4159-828f-d30a77f93a4b-kube-api-access-w7pcx\") pod \"ac15848f-4f6f-4159-828f-d30a77f93a4b\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.684891 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-0\") pod 
\"ac15848f-4f6f-4159-828f-d30a77f93a4b\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.684920 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-inventory\") pod \"ac15848f-4f6f-4159-828f-d30a77f93a4b\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.685034 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-telemetry-combined-ca-bundle\") pod \"ac15848f-4f6f-4159-828f-d30a77f93a4b\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.691519 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "ac15848f-4f6f-4159-828f-d30a77f93a4b" (UID: "ac15848f-4f6f-4159-828f-d30a77f93a4b"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.691919 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac15848f-4f6f-4159-828f-d30a77f93a4b-kube-api-access-w7pcx" (OuterVolumeSpecName: "kube-api-access-w7pcx") pod "ac15848f-4f6f-4159-828f-d30a77f93a4b" (UID: "ac15848f-4f6f-4159-828f-d30a77f93a4b"). InnerVolumeSpecName "kube-api-access-w7pcx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.712805 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "ac15848f-4f6f-4159-828f-d30a77f93a4b" (UID: "ac15848f-4f6f-4159-828f-d30a77f93a4b"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.715968 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ac15848f-4f6f-4159-828f-d30a77f93a4b" (UID: "ac15848f-4f6f-4159-828f-d30a77f93a4b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.716440 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-inventory" (OuterVolumeSpecName: "inventory") pod "ac15848f-4f6f-4159-828f-d30a77f93a4b" (UID: "ac15848f-4f6f-4159-828f-d30a77f93a4b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.717234 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "ac15848f-4f6f-4159-828f-d30a77f93a4b" (UID: "ac15848f-4f6f-4159-828f-d30a77f93a4b"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.723815 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "ac15848f-4f6f-4159-828f-d30a77f93a4b" (UID: "ac15848f-4f6f-4159-828f-d30a77f93a4b"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.787877 4837 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.788210 4837 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.788347 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.788412 4837 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.788475 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7pcx\" (UniqueName: \"kubernetes.io/projected/ac15848f-4f6f-4159-828f-d30a77f93a4b-kube-api-access-w7pcx\") on node \"crc\" 
DevicePath \"\"" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.788628 4837 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.788725 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 12:29:03 crc kubenswrapper[4837]: I0313 12:29:03.133613 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" event={"ID":"ac15848f-4f6f-4159-828f-d30a77f93a4b","Type":"ContainerDied","Data":"b7bde5f05c7e3a4c4bb83a3a8eca2fac351228924917d2ac26964f39056c8c9f"} Mar 13 12:29:03 crc kubenswrapper[4837]: I0313 12:29:03.133974 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7bde5f05c7e3a4c4bb83a3a8eca2fac351228924917d2ac26964f39056c8c9f" Mar 13 12:29:03 crc kubenswrapper[4837]: I0313 12:29:03.133689 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:29:04 crc kubenswrapper[4837]: E0313 12:29:04.486011 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72d56baf_1d17_4cbb_a351_8f5bf373c768.slice/crio-16962aea59727abab979a02d6decfef8fe22e9a60374e99da3395d6326096e2e\": RecentStats: unable to find data in memory cache]" Mar 13 12:29:07 crc kubenswrapper[4837]: I0313 12:29:07.048682 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:29:07 crc kubenswrapper[4837]: E0313 12:29:07.049514 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:29:21 crc kubenswrapper[4837]: I0313 12:29:21.048109 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:29:21 crc kubenswrapper[4837]: E0313 12:29:21.048758 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:29:35 crc kubenswrapper[4837]: I0313 12:29:35.048449 4837 scope.go:117] "RemoveContainer" 
containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:29:35 crc kubenswrapper[4837]: E0313 12:29:35.049215 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:29:50 crc kubenswrapper[4837]: I0313 12:29:50.048241 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:29:50 crc kubenswrapper[4837]: E0313 12:29:50.049083 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:29:59 crc kubenswrapper[4837]: I0313 12:29:59.965921 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 13 12:29:59 crc kubenswrapper[4837]: E0313 12:29:59.967401 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72d56baf-1d17-4cbb-a351-8f5bf373c768" containerName="oc" Mar 13 12:29:59 crc kubenswrapper[4837]: I0313 12:29:59.967436 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="72d56baf-1d17-4cbb-a351-8f5bf373c768" containerName="oc" Mar 13 12:29:59 crc kubenswrapper[4837]: E0313 12:29:59.967460 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac15848f-4f6f-4159-828f-d30a77f93a4b" 
containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 13 12:29:59 crc kubenswrapper[4837]: I0313 12:29:59.967467 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac15848f-4f6f-4159-828f-d30a77f93a4b" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 13 12:29:59 crc kubenswrapper[4837]: I0313 12:29:59.967810 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac15848f-4f6f-4159-828f-d30a77f93a4b" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 13 12:29:59 crc kubenswrapper[4837]: I0313 12:29:59.967874 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="72d56baf-1d17-4cbb-a351-8f5bf373c768" containerName="oc" Mar 13 12:29:59 crc kubenswrapper[4837]: I0313 12:29:59.968869 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 13 12:29:59 crc kubenswrapper[4837]: I0313 12:29:59.972935 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 13 12:29:59 crc kubenswrapper[4837]: I0313 12:29:59.973333 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 13 12:29:59 crc kubenswrapper[4837]: I0313 12:29:59.973066 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bvdx7" Mar 13 12:29:59 crc kubenswrapper[4837]: I0313 12:29:59.973163 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 13 12:29:59 crc kubenswrapper[4837]: I0313 12:29:59.976328 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.035211 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/66bdda91-c5b6-4879-9adf-21846884c797-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.035294 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.035332 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.035388 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66bdda91-c5b6-4879-9adf-21846884c797-config-data\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.035445 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/66bdda91-c5b6-4879-9adf-21846884c797-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.035481 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.035517 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/66bdda91-c5b6-4879-9adf-21846884c797-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.035558 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdksm\" (UniqueName: \"kubernetes.io/projected/66bdda91-c5b6-4879-9adf-21846884c797-kube-api-access-zdksm\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.035665 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.138042 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/66bdda91-c5b6-4879-9adf-21846884c797-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.138110 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.138153 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/66bdda91-c5b6-4879-9adf-21846884c797-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.138203 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdksm\" (UniqueName: \"kubernetes.io/projected/66bdda91-c5b6-4879-9adf-21846884c797-kube-api-access-zdksm\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.138786 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/66bdda91-c5b6-4879-9adf-21846884c797-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.138935 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/66bdda91-c5b6-4879-9adf-21846884c797-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.139097 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.139615 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66bdda91-c5b6-4879-9adf-21846884c797-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.139708 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.139752 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.139829 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66bdda91-c5b6-4879-9adf-21846884c797-config-data\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.141203 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66bdda91-c5b6-4879-9adf-21846884c797-config-data\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " 
pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.142113 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.146508 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.151672 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.152143 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.152189 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66bdda91-c5b6-4879-9adf-21846884c797-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.152299 4837 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29556750-xl9g6"] Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.157554 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556750-xl9g6" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.161203 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.161393 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.162047 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.164048 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdksm\" (UniqueName: \"kubernetes.io/projected/66bdda91-c5b6-4879-9adf-21846884c797-kube-api-access-zdksm\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.167627 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln"] Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.169209 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.172507 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.174271 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.182163 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556750-xl9g6"] Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.192176 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln"] Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.194957 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.241502 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw492\" (UniqueName: \"kubernetes.io/projected/e39a0509-ea55-4b46-a3dc-473bb655cad8-kube-api-access-hw492\") pod \"auto-csr-approver-29556750-xl9g6\" (UID: \"e39a0509-ea55-4b46-a3dc-473bb655cad8\") " pod="openshift-infra/auto-csr-approver-29556750-xl9g6" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.241543 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76f7e035-5d7e-46a2-befe-a8e414e93d86-config-volume\") pod \"collect-profiles-29556750-l24ln\" (UID: 
\"76f7e035-5d7e-46a2-befe-a8e414e93d86\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.241793 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qjw7\" (UniqueName: \"kubernetes.io/projected/76f7e035-5d7e-46a2-befe-a8e414e93d86-kube-api-access-4qjw7\") pod \"collect-profiles-29556750-l24ln\" (UID: \"76f7e035-5d7e-46a2-befe-a8e414e93d86\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.241887 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76f7e035-5d7e-46a2-befe-a8e414e93d86-secret-volume\") pod \"collect-profiles-29556750-l24ln\" (UID: \"76f7e035-5d7e-46a2-befe-a8e414e93d86\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.293616 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.343487 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qjw7\" (UniqueName: \"kubernetes.io/projected/76f7e035-5d7e-46a2-befe-a8e414e93d86-kube-api-access-4qjw7\") pod \"collect-profiles-29556750-l24ln\" (UID: \"76f7e035-5d7e-46a2-befe-a8e414e93d86\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.343564 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76f7e035-5d7e-46a2-befe-a8e414e93d86-secret-volume\") pod \"collect-profiles-29556750-l24ln\" (UID: \"76f7e035-5d7e-46a2-befe-a8e414e93d86\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.343605 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw492\" (UniqueName: \"kubernetes.io/projected/e39a0509-ea55-4b46-a3dc-473bb655cad8-kube-api-access-hw492\") pod \"auto-csr-approver-29556750-xl9g6\" (UID: \"e39a0509-ea55-4b46-a3dc-473bb655cad8\") " pod="openshift-infra/auto-csr-approver-29556750-xl9g6" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.343634 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76f7e035-5d7e-46a2-befe-a8e414e93d86-config-volume\") pod \"collect-profiles-29556750-l24ln\" (UID: \"76f7e035-5d7e-46a2-befe-a8e414e93d86\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.344686 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76f7e035-5d7e-46a2-befe-a8e414e93d86-config-volume\") 
pod \"collect-profiles-29556750-l24ln\" (UID: \"76f7e035-5d7e-46a2-befe-a8e414e93d86\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.350555 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76f7e035-5d7e-46a2-befe-a8e414e93d86-secret-volume\") pod \"collect-profiles-29556750-l24ln\" (UID: \"76f7e035-5d7e-46a2-befe-a8e414e93d86\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.362425 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qjw7\" (UniqueName: \"kubernetes.io/projected/76f7e035-5d7e-46a2-befe-a8e414e93d86-kube-api-access-4qjw7\") pod \"collect-profiles-29556750-l24ln\" (UID: \"76f7e035-5d7e-46a2-befe-a8e414e93d86\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.363004 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw492\" (UniqueName: \"kubernetes.io/projected/e39a0509-ea55-4b46-a3dc-473bb655cad8-kube-api-access-hw492\") pod \"auto-csr-approver-29556750-xl9g6\" (UID: \"e39a0509-ea55-4b46-a3dc-473bb655cad8\") " pod="openshift-infra/auto-csr-approver-29556750-xl9g6" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.572391 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556750-xl9g6" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.581220 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.739775 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.757346 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:30:01 crc kubenswrapper[4837]: W0313 12:30:01.025117 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode39a0509_ea55_4b46_a3dc_473bb655cad8.slice/crio-e02b5a85207cd574f3864bf7eedaf6214d2f2763660958e5efd2708fc7a806eb WatchSource:0}: Error finding container e02b5a85207cd574f3864bf7eedaf6214d2f2763660958e5efd2708fc7a806eb: Status 404 returned error can't find the container with id e02b5a85207cd574f3864bf7eedaf6214d2f2763660958e5efd2708fc7a806eb Mar 13 12:30:01 crc kubenswrapper[4837]: I0313 12:30:01.026361 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556750-xl9g6"] Mar 13 12:30:01 crc kubenswrapper[4837]: W0313 12:30:01.092051 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76f7e035_5d7e_46a2_befe_a8e414e93d86.slice/crio-c89b83aad1ee935ee8dd45e177b8e6ce9366ca2dcd96dae48ec255b4f9217c3f WatchSource:0}: Error finding container c89b83aad1ee935ee8dd45e177b8e6ce9366ca2dcd96dae48ec255b4f9217c3f: Status 404 returned error can't find the container with id c89b83aad1ee935ee8dd45e177b8e6ce9366ca2dcd96dae48ec255b4f9217c3f Mar 13 12:30:01 crc kubenswrapper[4837]: I0313 12:30:01.094596 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln"] Mar 13 12:30:01 crc kubenswrapper[4837]: I0313 12:30:01.603290 4837 generic.go:334] "Generic (PLEG): container finished" 
podID="76f7e035-5d7e-46a2-befe-a8e414e93d86" containerID="9ebad436416607ee3afe8e629996f2bb4d9f2b83ecead032b698d990edb417f1" exitCode=0 Mar 13 12:30:01 crc kubenswrapper[4837]: I0313 12:30:01.603608 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" event={"ID":"76f7e035-5d7e-46a2-befe-a8e414e93d86","Type":"ContainerDied","Data":"9ebad436416607ee3afe8e629996f2bb4d9f2b83ecead032b698d990edb417f1"} Mar 13 12:30:01 crc kubenswrapper[4837]: I0313 12:30:01.603656 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" event={"ID":"76f7e035-5d7e-46a2-befe-a8e414e93d86","Type":"ContainerStarted","Data":"c89b83aad1ee935ee8dd45e177b8e6ce9366ca2dcd96dae48ec255b4f9217c3f"} Mar 13 12:30:01 crc kubenswrapper[4837]: I0313 12:30:01.606099 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"66bdda91-c5b6-4879-9adf-21846884c797","Type":"ContainerStarted","Data":"e0109150fdc9bce6fc2a2f4d23a6692ef997ab608b50f1cec0fb2562f9d86611"} Mar 13 12:30:01 crc kubenswrapper[4837]: I0313 12:30:01.608016 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556750-xl9g6" event={"ID":"e39a0509-ea55-4b46-a3dc-473bb655cad8","Type":"ContainerStarted","Data":"e02b5a85207cd574f3864bf7eedaf6214d2f2763660958e5efd2708fc7a806eb"} Mar 13 12:30:02 crc kubenswrapper[4837]: I0313 12:30:02.956698 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" Mar 13 12:30:03 crc kubenswrapper[4837]: I0313 12:30:03.108289 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76f7e035-5d7e-46a2-befe-a8e414e93d86-secret-volume\") pod \"76f7e035-5d7e-46a2-befe-a8e414e93d86\" (UID: \"76f7e035-5d7e-46a2-befe-a8e414e93d86\") " Mar 13 12:30:03 crc kubenswrapper[4837]: I0313 12:30:03.108791 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76f7e035-5d7e-46a2-befe-a8e414e93d86-config-volume\") pod \"76f7e035-5d7e-46a2-befe-a8e414e93d86\" (UID: \"76f7e035-5d7e-46a2-befe-a8e414e93d86\") " Mar 13 12:30:03 crc kubenswrapper[4837]: I0313 12:30:03.109018 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qjw7\" (UniqueName: \"kubernetes.io/projected/76f7e035-5d7e-46a2-befe-a8e414e93d86-kube-api-access-4qjw7\") pod \"76f7e035-5d7e-46a2-befe-a8e414e93d86\" (UID: \"76f7e035-5d7e-46a2-befe-a8e414e93d86\") " Mar 13 12:30:03 crc kubenswrapper[4837]: I0313 12:30:03.110030 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76f7e035-5d7e-46a2-befe-a8e414e93d86-config-volume" (OuterVolumeSpecName: "config-volume") pod "76f7e035-5d7e-46a2-befe-a8e414e93d86" (UID: "76f7e035-5d7e-46a2-befe-a8e414e93d86"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:30:03 crc kubenswrapper[4837]: I0313 12:30:03.117009 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76f7e035-5d7e-46a2-befe-a8e414e93d86-kube-api-access-4qjw7" (OuterVolumeSpecName: "kube-api-access-4qjw7") pod "76f7e035-5d7e-46a2-befe-a8e414e93d86" (UID: "76f7e035-5d7e-46a2-befe-a8e414e93d86"). 
InnerVolumeSpecName "kube-api-access-4qjw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:30:03 crc kubenswrapper[4837]: I0313 12:30:03.117159 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76f7e035-5d7e-46a2-befe-a8e414e93d86-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "76f7e035-5d7e-46a2-befe-a8e414e93d86" (UID: "76f7e035-5d7e-46a2-befe-a8e414e93d86"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:30:03 crc kubenswrapper[4837]: I0313 12:30:03.211830 4837 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76f7e035-5d7e-46a2-befe-a8e414e93d86-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 12:30:03 crc kubenswrapper[4837]: I0313 12:30:03.211868 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76f7e035-5d7e-46a2-befe-a8e414e93d86-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 12:30:03 crc kubenswrapper[4837]: I0313 12:30:03.211878 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qjw7\" (UniqueName: \"kubernetes.io/projected/76f7e035-5d7e-46a2-befe-a8e414e93d86-kube-api-access-4qjw7\") on node \"crc\" DevicePath \"\"" Mar 13 12:30:03 crc kubenswrapper[4837]: I0313 12:30:03.635519 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" event={"ID":"76f7e035-5d7e-46a2-befe-a8e414e93d86","Type":"ContainerDied","Data":"c89b83aad1ee935ee8dd45e177b8e6ce9366ca2dcd96dae48ec255b4f9217c3f"} Mar 13 12:30:03 crc kubenswrapper[4837]: I0313 12:30:03.635570 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c89b83aad1ee935ee8dd45e177b8e6ce9366ca2dcd96dae48ec255b4f9217c3f" Mar 13 12:30:03 crc kubenswrapper[4837]: I0313 12:30:03.635583 4837 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" Mar 13 12:30:04 crc kubenswrapper[4837]: I0313 12:30:04.024882 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr"] Mar 13 12:30:04 crc kubenswrapper[4837]: I0313 12:30:04.032409 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr"] Mar 13 12:30:05 crc kubenswrapper[4837]: I0313 12:30:05.087107 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:30:05 crc kubenswrapper[4837]: E0313 12:30:05.087376 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:30:05 crc kubenswrapper[4837]: I0313 12:30:05.087710 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="831db5b2-5229-4b52-8783-f99c640ba856" path="/var/lib/kubelet/pods/831db5b2-5229-4b52-8783-f99c640ba856/volumes" Mar 13 12:30:08 crc kubenswrapper[4837]: I0313 12:30:08.691400 4837 generic.go:334] "Generic (PLEG): container finished" podID="e39a0509-ea55-4b46-a3dc-473bb655cad8" containerID="b4cb982598c9648b581b684b81629524a6916bc2574ae740b552fa7040fb8d2e" exitCode=0 Mar 13 12:30:08 crc kubenswrapper[4837]: I0313 12:30:08.691654 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556750-xl9g6" 
event={"ID":"e39a0509-ea55-4b46-a3dc-473bb655cad8","Type":"ContainerDied","Data":"b4cb982598c9648b581b684b81629524a6916bc2574ae740b552fa7040fb8d2e"} Mar 13 12:30:10 crc kubenswrapper[4837]: I0313 12:30:10.072390 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556750-xl9g6" Mar 13 12:30:10 crc kubenswrapper[4837]: I0313 12:30:10.156854 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw492\" (UniqueName: \"kubernetes.io/projected/e39a0509-ea55-4b46-a3dc-473bb655cad8-kube-api-access-hw492\") pod \"e39a0509-ea55-4b46-a3dc-473bb655cad8\" (UID: \"e39a0509-ea55-4b46-a3dc-473bb655cad8\") " Mar 13 12:30:10 crc kubenswrapper[4837]: I0313 12:30:10.164262 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e39a0509-ea55-4b46-a3dc-473bb655cad8-kube-api-access-hw492" (OuterVolumeSpecName: "kube-api-access-hw492") pod "e39a0509-ea55-4b46-a3dc-473bb655cad8" (UID: "e39a0509-ea55-4b46-a3dc-473bb655cad8"). InnerVolumeSpecName "kube-api-access-hw492". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:30:10 crc kubenswrapper[4837]: I0313 12:30:10.259653 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw492\" (UniqueName: \"kubernetes.io/projected/e39a0509-ea55-4b46-a3dc-473bb655cad8-kube-api-access-hw492\") on node \"crc\" DevicePath \"\"" Mar 13 12:30:10 crc kubenswrapper[4837]: I0313 12:30:10.713415 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556750-xl9g6" event={"ID":"e39a0509-ea55-4b46-a3dc-473bb655cad8","Type":"ContainerDied","Data":"e02b5a85207cd574f3864bf7eedaf6214d2f2763660958e5efd2708fc7a806eb"} Mar 13 12:30:10 crc kubenswrapper[4837]: I0313 12:30:10.713472 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e02b5a85207cd574f3864bf7eedaf6214d2f2763660958e5efd2708fc7a806eb" Mar 13 12:30:10 crc kubenswrapper[4837]: I0313 12:30:10.713482 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556750-xl9g6" Mar 13 12:30:11 crc kubenswrapper[4837]: I0313 12:30:11.163155 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556744-m59fh"] Mar 13 12:30:11 crc kubenswrapper[4837]: I0313 12:30:11.182674 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556744-m59fh"] Mar 13 12:30:13 crc kubenswrapper[4837]: I0313 12:30:13.063201 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e934250-1bb4-41fe-b36e-2acf48194bcf" path="/var/lib/kubelet/pods/1e934250-1bb4-41fe-b36e-2acf48194bcf/volumes" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.189542 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-28bfn"] Mar 13 12:30:16 crc kubenswrapper[4837]: E0313 12:30:16.190294 4837 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="76f7e035-5d7e-46a2-befe-a8e414e93d86" containerName="collect-profiles" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.190310 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="76f7e035-5d7e-46a2-befe-a8e414e93d86" containerName="collect-profiles" Mar 13 12:30:16 crc kubenswrapper[4837]: E0313 12:30:16.190344 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e39a0509-ea55-4b46-a3dc-473bb655cad8" containerName="oc" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.190351 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e39a0509-ea55-4b46-a3dc-473bb655cad8" containerName="oc" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.190566 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e39a0509-ea55-4b46-a3dc-473bb655cad8" containerName="oc" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.190588 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="76f7e035-5d7e-46a2-befe-a8e414e93d86" containerName="collect-profiles" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.193451 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.204383 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-28bfn"] Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.278204 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27ee91bc-e007-4c68-99b5-34c7d0582011-utilities\") pod \"certified-operators-28bfn\" (UID: \"27ee91bc-e007-4c68-99b5-34c7d0582011\") " pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.278276 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bbsb\" (UniqueName: \"kubernetes.io/projected/27ee91bc-e007-4c68-99b5-34c7d0582011-kube-api-access-8bbsb\") pod \"certified-operators-28bfn\" (UID: \"27ee91bc-e007-4c68-99b5-34c7d0582011\") " pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.278347 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27ee91bc-e007-4c68-99b5-34c7d0582011-catalog-content\") pod \"certified-operators-28bfn\" (UID: \"27ee91bc-e007-4c68-99b5-34c7d0582011\") " pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.382484 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27ee91bc-e007-4c68-99b5-34c7d0582011-utilities\") pod \"certified-operators-28bfn\" (UID: \"27ee91bc-e007-4c68-99b5-34c7d0582011\") " pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.382872 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8bbsb\" (UniqueName: \"kubernetes.io/projected/27ee91bc-e007-4c68-99b5-34c7d0582011-kube-api-access-8bbsb\") pod \"certified-operators-28bfn\" (UID: \"27ee91bc-e007-4c68-99b5-34c7d0582011\") " pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.383160 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27ee91bc-e007-4c68-99b5-34c7d0582011-utilities\") pod \"certified-operators-28bfn\" (UID: \"27ee91bc-e007-4c68-99b5-34c7d0582011\") " pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.383750 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27ee91bc-e007-4c68-99b5-34c7d0582011-catalog-content\") pod \"certified-operators-28bfn\" (UID: \"27ee91bc-e007-4c68-99b5-34c7d0582011\") " pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.384230 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27ee91bc-e007-4c68-99b5-34c7d0582011-catalog-content\") pod \"certified-operators-28bfn\" (UID: \"27ee91bc-e007-4c68-99b5-34c7d0582011\") " pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.424108 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bbsb\" (UniqueName: \"kubernetes.io/projected/27ee91bc-e007-4c68-99b5-34c7d0582011-kube-api-access-8bbsb\") pod \"certified-operators-28bfn\" (UID: \"27ee91bc-e007-4c68-99b5-34c7d0582011\") " pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.534660 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:20 crc kubenswrapper[4837]: I0313 12:30:20.048246 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:30:20 crc kubenswrapper[4837]: E0313 12:30:20.048801 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:30:30 crc kubenswrapper[4837]: I0313 12:30:30.544911 4837 scope.go:117] "RemoveContainer" containerID="965aad43c7ccd189d4d18246f935c745fc24b5e2cfb5b07896f9492e9109fb55" Mar 13 12:30:30 crc kubenswrapper[4837]: E0313 12:30:30.586238 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 13 12:30:30 crc kubenswrapper[4837]: E0313 12:30:30.586418 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zdksm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(66bdda91-c5b6-4879-9adf-21846884c797): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:30:30 crc kubenswrapper[4837]: E0313 12:30:30.587666 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="66bdda91-c5b6-4879-9adf-21846884c797" Mar 13 12:30:30 crc kubenswrapper[4837]: I0313 12:30:30.635084 4837 scope.go:117] "RemoveContainer" containerID="5d6f6eb18121de7ac4f3538b881026fd87404ecae22fe7e8d631b874d26990e4" Mar 13 12:30:30 crc kubenswrapper[4837]: E0313 12:30:30.878982 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="66bdda91-c5b6-4879-9adf-21846884c797" Mar 13 12:30:30 crc kubenswrapper[4837]: I0313 12:30:30.937797 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-28bfn"] Mar 13 12:30:30 crc kubenswrapper[4837]: W0313 12:30:30.939465 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27ee91bc_e007_4c68_99b5_34c7d0582011.slice/crio-5fc0ef45e263712833cb4f519304d7593121798056d551c869c283dc2e6747e6 WatchSource:0}: Error finding container 5fc0ef45e263712833cb4f519304d7593121798056d551c869c283dc2e6747e6: Status 404 returned error can't find the container with id 5fc0ef45e263712833cb4f519304d7593121798056d551c869c283dc2e6747e6 Mar 13 12:30:31 crc kubenswrapper[4837]: I0313 12:30:31.887537 4837 generic.go:334] "Generic (PLEG): container finished" podID="27ee91bc-e007-4c68-99b5-34c7d0582011" containerID="93168fca68a8d6317f60d8aeaccee8e87ed38a7a55bd16d7de59d1bf8ee00212" exitCode=0 Mar 13 12:30:31 crc kubenswrapper[4837]: I0313 12:30:31.887752 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28bfn" event={"ID":"27ee91bc-e007-4c68-99b5-34c7d0582011","Type":"ContainerDied","Data":"93168fca68a8d6317f60d8aeaccee8e87ed38a7a55bd16d7de59d1bf8ee00212"} Mar 13 12:30:31 crc kubenswrapper[4837]: I0313 12:30:31.887864 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28bfn" event={"ID":"27ee91bc-e007-4c68-99b5-34c7d0582011","Type":"ContainerStarted","Data":"5fc0ef45e263712833cb4f519304d7593121798056d551c869c283dc2e6747e6"} Mar 13 12:30:33 crc kubenswrapper[4837]: I0313 12:30:33.906660 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28bfn" 
event={"ID":"27ee91bc-e007-4c68-99b5-34c7d0582011","Type":"ContainerStarted","Data":"a8175e7d0cec8aad075327704d3226eef42ece8f46fb741b77e0c3dfe1833f0f"} Mar 13 12:30:34 crc kubenswrapper[4837]: I0313 12:30:34.917303 4837 generic.go:334] "Generic (PLEG): container finished" podID="27ee91bc-e007-4c68-99b5-34c7d0582011" containerID="a8175e7d0cec8aad075327704d3226eef42ece8f46fb741b77e0c3dfe1833f0f" exitCode=0 Mar 13 12:30:34 crc kubenswrapper[4837]: I0313 12:30:34.917351 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28bfn" event={"ID":"27ee91bc-e007-4c68-99b5-34c7d0582011","Type":"ContainerDied","Data":"a8175e7d0cec8aad075327704d3226eef42ece8f46fb741b77e0c3dfe1833f0f"} Mar 13 12:30:35 crc kubenswrapper[4837]: I0313 12:30:35.054356 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:30:35 crc kubenswrapper[4837]: E0313 12:30:35.055044 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:30:35 crc kubenswrapper[4837]: I0313 12:30:35.931118 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28bfn" event={"ID":"27ee91bc-e007-4c68-99b5-34c7d0582011","Type":"ContainerStarted","Data":"3ae07172d6e621f542f887d75f91e3f8cd7080cb359dfe09039ef1cba01ff6fd"} Mar 13 12:30:35 crc kubenswrapper[4837]: I0313 12:30:35.963980 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-28bfn" podStartSLOduration=16.251353086 podStartE2EDuration="19.963946567s" 
podCreationTimestamp="2026-03-13 12:30:16 +0000 UTC" firstStartedPulling="2026-03-13 12:30:31.889671157 +0000 UTC m=+2547.527937920" lastFinishedPulling="2026-03-13 12:30:35.602264638 +0000 UTC m=+2551.240531401" observedRunningTime="2026-03-13 12:30:35.961977574 +0000 UTC m=+2551.600244337" watchObservedRunningTime="2026-03-13 12:30:35.963946567 +0000 UTC m=+2551.602213330" Mar 13 12:30:36 crc kubenswrapper[4837]: I0313 12:30:36.534945 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:36 crc kubenswrapper[4837]: I0313 12:30:36.535003 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:37 crc kubenswrapper[4837]: I0313 12:30:37.583657 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-28bfn" podUID="27ee91bc-e007-4c68-99b5-34c7d0582011" containerName="registry-server" probeResult="failure" output=< Mar 13 12:30:37 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Mar 13 12:30:37 crc kubenswrapper[4837]: > Mar 13 12:30:45 crc kubenswrapper[4837]: I0313 12:30:45.500849 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 13 12:30:46 crc kubenswrapper[4837]: I0313 12:30:46.581157 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:46 crc kubenswrapper[4837]: I0313 12:30:46.630508 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:47 crc kubenswrapper[4837]: I0313 12:30:47.043023 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" 
event={"ID":"66bdda91-c5b6-4879-9adf-21846884c797","Type":"ContainerStarted","Data":"bda10fa8fd12669f2f471650132835bc9a8231ba850dd11df31ebbad360b9cf6"} Mar 13 12:30:47 crc kubenswrapper[4837]: I0313 12:30:47.075380 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.334128991 podStartE2EDuration="49.075359151s" podCreationTimestamp="2026-03-13 12:29:58 +0000 UTC" firstStartedPulling="2026-03-13 12:30:00.756311218 +0000 UTC m=+2516.394577981" lastFinishedPulling="2026-03-13 12:30:45.497541378 +0000 UTC m=+2561.135808141" observedRunningTime="2026-03-13 12:30:47.059195012 +0000 UTC m=+2562.697461775" watchObservedRunningTime="2026-03-13 12:30:47.075359151 +0000 UTC m=+2562.713625914" Mar 13 12:30:47 crc kubenswrapper[4837]: I0313 12:30:47.394011 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-28bfn"] Mar 13 12:30:48 crc kubenswrapper[4837]: I0313 12:30:48.051013 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-28bfn" podUID="27ee91bc-e007-4c68-99b5-34c7d0582011" containerName="registry-server" containerID="cri-o://3ae07172d6e621f542f887d75f91e3f8cd7080cb359dfe09039ef1cba01ff6fd" gracePeriod=2 Mar 13 12:30:48 crc kubenswrapper[4837]: I0313 12:30:48.498595 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:48 crc kubenswrapper[4837]: I0313 12:30:48.640750 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bbsb\" (UniqueName: \"kubernetes.io/projected/27ee91bc-e007-4c68-99b5-34c7d0582011-kube-api-access-8bbsb\") pod \"27ee91bc-e007-4c68-99b5-34c7d0582011\" (UID: \"27ee91bc-e007-4c68-99b5-34c7d0582011\") " Mar 13 12:30:48 crc kubenswrapper[4837]: I0313 12:30:48.640951 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27ee91bc-e007-4c68-99b5-34c7d0582011-utilities\") pod \"27ee91bc-e007-4c68-99b5-34c7d0582011\" (UID: \"27ee91bc-e007-4c68-99b5-34c7d0582011\") " Mar 13 12:30:48 crc kubenswrapper[4837]: I0313 12:30:48.641073 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27ee91bc-e007-4c68-99b5-34c7d0582011-catalog-content\") pod \"27ee91bc-e007-4c68-99b5-34c7d0582011\" (UID: \"27ee91bc-e007-4c68-99b5-34c7d0582011\") " Mar 13 12:30:48 crc kubenswrapper[4837]: I0313 12:30:48.641980 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27ee91bc-e007-4c68-99b5-34c7d0582011-utilities" (OuterVolumeSpecName: "utilities") pod "27ee91bc-e007-4c68-99b5-34c7d0582011" (UID: "27ee91bc-e007-4c68-99b5-34c7d0582011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:30:48 crc kubenswrapper[4837]: I0313 12:30:48.646909 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27ee91bc-e007-4c68-99b5-34c7d0582011-kube-api-access-8bbsb" (OuterVolumeSpecName: "kube-api-access-8bbsb") pod "27ee91bc-e007-4c68-99b5-34c7d0582011" (UID: "27ee91bc-e007-4c68-99b5-34c7d0582011"). InnerVolumeSpecName "kube-api-access-8bbsb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:30:48 crc kubenswrapper[4837]: I0313 12:30:48.705260 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27ee91bc-e007-4c68-99b5-34c7d0582011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27ee91bc-e007-4c68-99b5-34c7d0582011" (UID: "27ee91bc-e007-4c68-99b5-34c7d0582011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:30:48 crc kubenswrapper[4837]: I0313 12:30:48.743174 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bbsb\" (UniqueName: \"kubernetes.io/projected/27ee91bc-e007-4c68-99b5-34c7d0582011-kube-api-access-8bbsb\") on node \"crc\" DevicePath \"\"" Mar 13 12:30:48 crc kubenswrapper[4837]: I0313 12:30:48.743208 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27ee91bc-e007-4c68-99b5-34c7d0582011-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:30:48 crc kubenswrapper[4837]: I0313 12:30:48.743218 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27ee91bc-e007-4c68-99b5-34c7d0582011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.065432 4837 generic.go:334] "Generic (PLEG): container finished" podID="27ee91bc-e007-4c68-99b5-34c7d0582011" containerID="3ae07172d6e621f542f887d75f91e3f8cd7080cb359dfe09039ef1cba01ff6fd" exitCode=0 Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.065480 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28bfn" event={"ID":"27ee91bc-e007-4c68-99b5-34c7d0582011","Type":"ContainerDied","Data":"3ae07172d6e621f542f887d75f91e3f8cd7080cb359dfe09039ef1cba01ff6fd"} Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.065509 4837 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-28bfn" event={"ID":"27ee91bc-e007-4c68-99b5-34c7d0582011","Type":"ContainerDied","Data":"5fc0ef45e263712833cb4f519304d7593121798056d551c869c283dc2e6747e6"} Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.065527 4837 scope.go:117] "RemoveContainer" containerID="3ae07172d6e621f542f887d75f91e3f8cd7080cb359dfe09039ef1cba01ff6fd" Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.065700 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.109074 4837 scope.go:117] "RemoveContainer" containerID="a8175e7d0cec8aad075327704d3226eef42ece8f46fb741b77e0c3dfe1833f0f" Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.117313 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-28bfn"] Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.133627 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-28bfn"] Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.152516 4837 scope.go:117] "RemoveContainer" containerID="93168fca68a8d6317f60d8aeaccee8e87ed38a7a55bd16d7de59d1bf8ee00212" Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.175320 4837 scope.go:117] "RemoveContainer" containerID="3ae07172d6e621f542f887d75f91e3f8cd7080cb359dfe09039ef1cba01ff6fd" Mar 13 12:30:49 crc kubenswrapper[4837]: E0313 12:30:49.176155 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ae07172d6e621f542f887d75f91e3f8cd7080cb359dfe09039ef1cba01ff6fd\": container with ID starting with 3ae07172d6e621f542f887d75f91e3f8cd7080cb359dfe09039ef1cba01ff6fd not found: ID does not exist" containerID="3ae07172d6e621f542f887d75f91e3f8cd7080cb359dfe09039ef1cba01ff6fd" Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 
12:30:49.176202 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ae07172d6e621f542f887d75f91e3f8cd7080cb359dfe09039ef1cba01ff6fd"} err="failed to get container status \"3ae07172d6e621f542f887d75f91e3f8cd7080cb359dfe09039ef1cba01ff6fd\": rpc error: code = NotFound desc = could not find container \"3ae07172d6e621f542f887d75f91e3f8cd7080cb359dfe09039ef1cba01ff6fd\": container with ID starting with 3ae07172d6e621f542f887d75f91e3f8cd7080cb359dfe09039ef1cba01ff6fd not found: ID does not exist" Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.176228 4837 scope.go:117] "RemoveContainer" containerID="a8175e7d0cec8aad075327704d3226eef42ece8f46fb741b77e0c3dfe1833f0f" Mar 13 12:30:49 crc kubenswrapper[4837]: E0313 12:30:49.176579 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8175e7d0cec8aad075327704d3226eef42ece8f46fb741b77e0c3dfe1833f0f\": container with ID starting with a8175e7d0cec8aad075327704d3226eef42ece8f46fb741b77e0c3dfe1833f0f not found: ID does not exist" containerID="a8175e7d0cec8aad075327704d3226eef42ece8f46fb741b77e0c3dfe1833f0f" Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.176608 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8175e7d0cec8aad075327704d3226eef42ece8f46fb741b77e0c3dfe1833f0f"} err="failed to get container status \"a8175e7d0cec8aad075327704d3226eef42ece8f46fb741b77e0c3dfe1833f0f\": rpc error: code = NotFound desc = could not find container \"a8175e7d0cec8aad075327704d3226eef42ece8f46fb741b77e0c3dfe1833f0f\": container with ID starting with a8175e7d0cec8aad075327704d3226eef42ece8f46fb741b77e0c3dfe1833f0f not found: ID does not exist" Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.176626 4837 scope.go:117] "RemoveContainer" containerID="93168fca68a8d6317f60d8aeaccee8e87ed38a7a55bd16d7de59d1bf8ee00212" Mar 13 12:30:49 crc 
kubenswrapper[4837]: E0313 12:30:49.177142 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93168fca68a8d6317f60d8aeaccee8e87ed38a7a55bd16d7de59d1bf8ee00212\": container with ID starting with 93168fca68a8d6317f60d8aeaccee8e87ed38a7a55bd16d7de59d1bf8ee00212 not found: ID does not exist" containerID="93168fca68a8d6317f60d8aeaccee8e87ed38a7a55bd16d7de59d1bf8ee00212" Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.177167 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93168fca68a8d6317f60d8aeaccee8e87ed38a7a55bd16d7de59d1bf8ee00212"} err="failed to get container status \"93168fca68a8d6317f60d8aeaccee8e87ed38a7a55bd16d7de59d1bf8ee00212\": rpc error: code = NotFound desc = could not find container \"93168fca68a8d6317f60d8aeaccee8e87ed38a7a55bd16d7de59d1bf8ee00212\": container with ID starting with 93168fca68a8d6317f60d8aeaccee8e87ed38a7a55bd16d7de59d1bf8ee00212 not found: ID does not exist" Mar 13 12:30:50 crc kubenswrapper[4837]: I0313 12:30:50.048571 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:30:50 crc kubenswrapper[4837]: E0313 12:30:50.049241 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:30:51 crc kubenswrapper[4837]: I0313 12:30:51.066388 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27ee91bc-e007-4c68-99b5-34c7d0582011" path="/var/lib/kubelet/pods/27ee91bc-e007-4c68-99b5-34c7d0582011/volumes" Mar 13 12:31:02 crc 
kubenswrapper[4837]: I0313 12:31:02.048150 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:31:02 crc kubenswrapper[4837]: E0313 12:31:02.049215 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:31:16 crc kubenswrapper[4837]: I0313 12:31:16.048129 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:31:16 crc kubenswrapper[4837]: I0313 12:31:16.321965 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"2b3696e623d9d39a462c11ad27c35b22549ca870996c4db75be20036c9201734"} Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.633927 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-548kp"] Mar 13 12:31:19 crc kubenswrapper[4837]: E0313 12:31:19.635502 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27ee91bc-e007-4c68-99b5-34c7d0582011" containerName="registry-server" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.635518 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="27ee91bc-e007-4c68-99b5-34c7d0582011" containerName="registry-server" Mar 13 12:31:19 crc kubenswrapper[4837]: E0313 12:31:19.635533 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27ee91bc-e007-4c68-99b5-34c7d0582011" containerName="extract-content" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 
12:31:19.635540 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="27ee91bc-e007-4c68-99b5-34c7d0582011" containerName="extract-content" Mar 13 12:31:19 crc kubenswrapper[4837]: E0313 12:31:19.635567 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27ee91bc-e007-4c68-99b5-34c7d0582011" containerName="extract-utilities" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.635575 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="27ee91bc-e007-4c68-99b5-34c7d0582011" containerName="extract-utilities" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.635815 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="27ee91bc-e007-4c68-99b5-34c7d0582011" containerName="registry-server" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.637484 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.680687 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-548kp"] Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.742503 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44f44c76-3281-4bf0-af2e-0bba3d0dd712-catalog-content\") pod \"redhat-operators-548kp\" (UID: \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\") " pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.742702 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkpd6\" (UniqueName: \"kubernetes.io/projected/44f44c76-3281-4bf0-af2e-0bba3d0dd712-kube-api-access-qkpd6\") pod \"redhat-operators-548kp\" (UID: \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\") " pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 
12:31:19.742854 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44f44c76-3281-4bf0-af2e-0bba3d0dd712-utilities\") pod \"redhat-operators-548kp\" (UID: \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\") " pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.844655 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkpd6\" (UniqueName: \"kubernetes.io/projected/44f44c76-3281-4bf0-af2e-0bba3d0dd712-kube-api-access-qkpd6\") pod \"redhat-operators-548kp\" (UID: \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\") " pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.844766 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44f44c76-3281-4bf0-af2e-0bba3d0dd712-utilities\") pod \"redhat-operators-548kp\" (UID: \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\") " pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.844900 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44f44c76-3281-4bf0-af2e-0bba3d0dd712-catalog-content\") pod \"redhat-operators-548kp\" (UID: \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\") " pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.845473 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44f44c76-3281-4bf0-af2e-0bba3d0dd712-catalog-content\") pod \"redhat-operators-548kp\" (UID: \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\") " pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.845525 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44f44c76-3281-4bf0-af2e-0bba3d0dd712-utilities\") pod \"redhat-operators-548kp\" (UID: \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\") " pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.870537 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkpd6\" (UniqueName: \"kubernetes.io/projected/44f44c76-3281-4bf0-af2e-0bba3d0dd712-kube-api-access-qkpd6\") pod \"redhat-operators-548kp\" (UID: \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\") " pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.958729 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:20 crc kubenswrapper[4837]: I0313 12:31:20.275717 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-548kp"] Mar 13 12:31:20 crc kubenswrapper[4837]: W0313 12:31:20.291009 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44f44c76_3281_4bf0_af2e_0bba3d0dd712.slice/crio-4f4bd9532e4e3bcde381fc829056a99c3b26dd353413052ef976af5de3db5976 WatchSource:0}: Error finding container 4f4bd9532e4e3bcde381fc829056a99c3b26dd353413052ef976af5de3db5976: Status 404 returned error can't find the container with id 4f4bd9532e4e3bcde381fc829056a99c3b26dd353413052ef976af5de3db5976 Mar 13 12:31:20 crc kubenswrapper[4837]: I0313 12:31:20.403361 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-548kp" event={"ID":"44f44c76-3281-4bf0-af2e-0bba3d0dd712","Type":"ContainerStarted","Data":"4f4bd9532e4e3bcde381fc829056a99c3b26dd353413052ef976af5de3db5976"} Mar 13 12:31:21 crc kubenswrapper[4837]: I0313 12:31:21.412028 4837 
generic.go:334] "Generic (PLEG): container finished" podID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" containerID="f909d1a24957ee9bf34a08df5acbf22285940797d06b8e41029978786a4c1aae" exitCode=0 Mar 13 12:31:21 crc kubenswrapper[4837]: I0313 12:31:21.412077 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-548kp" event={"ID":"44f44c76-3281-4bf0-af2e-0bba3d0dd712","Type":"ContainerDied","Data":"f909d1a24957ee9bf34a08df5acbf22285940797d06b8e41029978786a4c1aae"} Mar 13 12:31:22 crc kubenswrapper[4837]: I0313 12:31:22.421713 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-548kp" event={"ID":"44f44c76-3281-4bf0-af2e-0bba3d0dd712","Type":"ContainerStarted","Data":"c025e0fcad011221373a532b0452eaa46808dfb3988ff1e006eac648a2c2453b"} Mar 13 12:31:27 crc kubenswrapper[4837]: I0313 12:31:27.468085 4837 generic.go:334] "Generic (PLEG): container finished" podID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" containerID="c025e0fcad011221373a532b0452eaa46808dfb3988ff1e006eac648a2c2453b" exitCode=0 Mar 13 12:31:27 crc kubenswrapper[4837]: I0313 12:31:27.468200 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-548kp" event={"ID":"44f44c76-3281-4bf0-af2e-0bba3d0dd712","Type":"ContainerDied","Data":"c025e0fcad011221373a532b0452eaa46808dfb3988ff1e006eac648a2c2453b"} Mar 13 12:31:28 crc kubenswrapper[4837]: I0313 12:31:28.479860 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-548kp" event={"ID":"44f44c76-3281-4bf0-af2e-0bba3d0dd712","Type":"ContainerStarted","Data":"15e13b316223eb538da03157c2f12378c04650ef2033b6a2e3fdeb73d34837f4"} Mar 13 12:31:28 crc kubenswrapper[4837]: I0313 12:31:28.506518 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-548kp" podStartSLOduration=3.07011843 podStartE2EDuration="9.506497167s" 
podCreationTimestamp="2026-03-13 12:31:19 +0000 UTC" firstStartedPulling="2026-03-13 12:31:21.414204686 +0000 UTC m=+2597.052471439" lastFinishedPulling="2026-03-13 12:31:27.850583403 +0000 UTC m=+2603.488850176" observedRunningTime="2026-03-13 12:31:28.503387689 +0000 UTC m=+2604.141654472" watchObservedRunningTime="2026-03-13 12:31:28.506497167 +0000 UTC m=+2604.144763950" Mar 13 12:31:29 crc kubenswrapper[4837]: I0313 12:31:29.959353 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:29 crc kubenswrapper[4837]: I0313 12:31:29.959656 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:31 crc kubenswrapper[4837]: I0313 12:31:31.009313 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-548kp" podUID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" containerName="registry-server" probeResult="failure" output=< Mar 13 12:31:31 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Mar 13 12:31:31 crc kubenswrapper[4837]: > Mar 13 12:31:41 crc kubenswrapper[4837]: I0313 12:31:41.004033 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-548kp" podUID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" containerName="registry-server" probeResult="failure" output=< Mar 13 12:31:41 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Mar 13 12:31:41 crc kubenswrapper[4837]: > Mar 13 12:31:50 crc kubenswrapper[4837]: I0313 12:31:50.005756 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:50 crc kubenswrapper[4837]: I0313 12:31:50.056402 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:50 crc kubenswrapper[4837]: 
I0313 12:31:50.831972 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-548kp"] Mar 13 12:31:51 crc kubenswrapper[4837]: I0313 12:31:51.672486 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-548kp" podUID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" containerName="registry-server" containerID="cri-o://15e13b316223eb538da03157c2f12378c04650ef2033b6a2e3fdeb73d34837f4" gracePeriod=2 Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.307614 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.441351 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44f44c76-3281-4bf0-af2e-0bba3d0dd712-catalog-content\") pod \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\" (UID: \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\") " Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.441565 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44f44c76-3281-4bf0-af2e-0bba3d0dd712-utilities\") pod \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\" (UID: \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\") " Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.441814 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkpd6\" (UniqueName: \"kubernetes.io/projected/44f44c76-3281-4bf0-af2e-0bba3d0dd712-kube-api-access-qkpd6\") pod \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\" (UID: \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\") " Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.442541 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44f44c76-3281-4bf0-af2e-0bba3d0dd712-utilities" (OuterVolumeSpecName: 
"utilities") pod "44f44c76-3281-4bf0-af2e-0bba3d0dd712" (UID: "44f44c76-3281-4bf0-af2e-0bba3d0dd712"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.447601 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44f44c76-3281-4bf0-af2e-0bba3d0dd712-kube-api-access-qkpd6" (OuterVolumeSpecName: "kube-api-access-qkpd6") pod "44f44c76-3281-4bf0-af2e-0bba3d0dd712" (UID: "44f44c76-3281-4bf0-af2e-0bba3d0dd712"). InnerVolumeSpecName "kube-api-access-qkpd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.544325 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44f44c76-3281-4bf0-af2e-0bba3d0dd712-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.544362 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkpd6\" (UniqueName: \"kubernetes.io/projected/44f44c76-3281-4bf0-af2e-0bba3d0dd712-kube-api-access-qkpd6\") on node \"crc\" DevicePath \"\"" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.578663 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44f44c76-3281-4bf0-af2e-0bba3d0dd712-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44f44c76-3281-4bf0-af2e-0bba3d0dd712" (UID: "44f44c76-3281-4bf0-af2e-0bba3d0dd712"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.646328 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44f44c76-3281-4bf0-af2e-0bba3d0dd712-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.683409 4837 generic.go:334] "Generic (PLEG): container finished" podID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" containerID="15e13b316223eb538da03157c2f12378c04650ef2033b6a2e3fdeb73d34837f4" exitCode=0 Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.683453 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-548kp" event={"ID":"44f44c76-3281-4bf0-af2e-0bba3d0dd712","Type":"ContainerDied","Data":"15e13b316223eb538da03157c2f12378c04650ef2033b6a2e3fdeb73d34837f4"} Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.684666 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-548kp" event={"ID":"44f44c76-3281-4bf0-af2e-0bba3d0dd712","Type":"ContainerDied","Data":"4f4bd9532e4e3bcde381fc829056a99c3b26dd353413052ef976af5de3db5976"} Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.683495 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.684740 4837 scope.go:117] "RemoveContainer" containerID="15e13b316223eb538da03157c2f12378c04650ef2033b6a2e3fdeb73d34837f4" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.711549 4837 scope.go:117] "RemoveContainer" containerID="c025e0fcad011221373a532b0452eaa46808dfb3988ff1e006eac648a2c2453b" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.727855 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-548kp"] Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.741325 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-548kp"] Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.744887 4837 scope.go:117] "RemoveContainer" containerID="f909d1a24957ee9bf34a08df5acbf22285940797d06b8e41029978786a4c1aae" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.778974 4837 scope.go:117] "RemoveContainer" containerID="15e13b316223eb538da03157c2f12378c04650ef2033b6a2e3fdeb73d34837f4" Mar 13 12:31:52 crc kubenswrapper[4837]: E0313 12:31:52.779465 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15e13b316223eb538da03157c2f12378c04650ef2033b6a2e3fdeb73d34837f4\": container with ID starting with 15e13b316223eb538da03157c2f12378c04650ef2033b6a2e3fdeb73d34837f4 not found: ID does not exist" containerID="15e13b316223eb538da03157c2f12378c04650ef2033b6a2e3fdeb73d34837f4" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.779493 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15e13b316223eb538da03157c2f12378c04650ef2033b6a2e3fdeb73d34837f4"} err="failed to get container status \"15e13b316223eb538da03157c2f12378c04650ef2033b6a2e3fdeb73d34837f4\": rpc error: code = NotFound desc = could not find container 
\"15e13b316223eb538da03157c2f12378c04650ef2033b6a2e3fdeb73d34837f4\": container with ID starting with 15e13b316223eb538da03157c2f12378c04650ef2033b6a2e3fdeb73d34837f4 not found: ID does not exist" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.779514 4837 scope.go:117] "RemoveContainer" containerID="c025e0fcad011221373a532b0452eaa46808dfb3988ff1e006eac648a2c2453b" Mar 13 12:31:52 crc kubenswrapper[4837]: E0313 12:31:52.780004 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c025e0fcad011221373a532b0452eaa46808dfb3988ff1e006eac648a2c2453b\": container with ID starting with c025e0fcad011221373a532b0452eaa46808dfb3988ff1e006eac648a2c2453b not found: ID does not exist" containerID="c025e0fcad011221373a532b0452eaa46808dfb3988ff1e006eac648a2c2453b" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.780029 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c025e0fcad011221373a532b0452eaa46808dfb3988ff1e006eac648a2c2453b"} err="failed to get container status \"c025e0fcad011221373a532b0452eaa46808dfb3988ff1e006eac648a2c2453b\": rpc error: code = NotFound desc = could not find container \"c025e0fcad011221373a532b0452eaa46808dfb3988ff1e006eac648a2c2453b\": container with ID starting with c025e0fcad011221373a532b0452eaa46808dfb3988ff1e006eac648a2c2453b not found: ID does not exist" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.780043 4837 scope.go:117] "RemoveContainer" containerID="f909d1a24957ee9bf34a08df5acbf22285940797d06b8e41029978786a4c1aae" Mar 13 12:31:52 crc kubenswrapper[4837]: E0313 12:31:52.780294 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f909d1a24957ee9bf34a08df5acbf22285940797d06b8e41029978786a4c1aae\": container with ID starting with f909d1a24957ee9bf34a08df5acbf22285940797d06b8e41029978786a4c1aae not found: ID does not exist" 
containerID="f909d1a24957ee9bf34a08df5acbf22285940797d06b8e41029978786a4c1aae" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.780337 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f909d1a24957ee9bf34a08df5acbf22285940797d06b8e41029978786a4c1aae"} err="failed to get container status \"f909d1a24957ee9bf34a08df5acbf22285940797d06b8e41029978786a4c1aae\": rpc error: code = NotFound desc = could not find container \"f909d1a24957ee9bf34a08df5acbf22285940797d06b8e41029978786a4c1aae\": container with ID starting with f909d1a24957ee9bf34a08df5acbf22285940797d06b8e41029978786a4c1aae not found: ID does not exist" Mar 13 12:31:53 crc kubenswrapper[4837]: I0313 12:31:53.064149 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" path="/var/lib/kubelet/pods/44f44c76-3281-4bf0-af2e-0bba3d0dd712/volumes" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.142276 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556752-5zvbj"] Mar 13 12:32:00 crc kubenswrapper[4837]: E0313 12:32:00.143412 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" containerName="extract-utilities" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.143432 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" containerName="extract-utilities" Mar 13 12:32:00 crc kubenswrapper[4837]: E0313 12:32:00.143453 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" containerName="registry-server" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.143459 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" containerName="registry-server" Mar 13 12:32:00 crc kubenswrapper[4837]: E0313 12:32:00.143486 4837 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" containerName="extract-content" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.143491 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" containerName="extract-content" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.143720 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" containerName="registry-server" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.144427 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556752-5zvbj" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.147116 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.147693 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.148521 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.150108 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556752-5zvbj"] Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.300301 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7gsn\" (UniqueName: \"kubernetes.io/projected/6b8e58c8-d8ed-4773-99bb-6b480514d2b8-kube-api-access-r7gsn\") pod \"auto-csr-approver-29556752-5zvbj\" (UID: \"6b8e58c8-d8ed-4773-99bb-6b480514d2b8\") " pod="openshift-infra/auto-csr-approver-29556752-5zvbj" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.402800 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7gsn\" 
(UniqueName: \"kubernetes.io/projected/6b8e58c8-d8ed-4773-99bb-6b480514d2b8-kube-api-access-r7gsn\") pod \"auto-csr-approver-29556752-5zvbj\" (UID: \"6b8e58c8-d8ed-4773-99bb-6b480514d2b8\") " pod="openshift-infra/auto-csr-approver-29556752-5zvbj" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.429364 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7gsn\" (UniqueName: \"kubernetes.io/projected/6b8e58c8-d8ed-4773-99bb-6b480514d2b8-kube-api-access-r7gsn\") pod \"auto-csr-approver-29556752-5zvbj\" (UID: \"6b8e58c8-d8ed-4773-99bb-6b480514d2b8\") " pod="openshift-infra/auto-csr-approver-29556752-5zvbj" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.468032 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556752-5zvbj" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.903356 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556752-5zvbj"] Mar 13 12:32:01 crc kubenswrapper[4837]: I0313 12:32:01.765916 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556752-5zvbj" event={"ID":"6b8e58c8-d8ed-4773-99bb-6b480514d2b8","Type":"ContainerStarted","Data":"ed83f04bc89e6bcbccb374bf4403804d23818bfe3d206cdaf46734815716f1cf"} Mar 13 12:32:02 crc kubenswrapper[4837]: I0313 12:32:02.777879 4837 generic.go:334] "Generic (PLEG): container finished" podID="6b8e58c8-d8ed-4773-99bb-6b480514d2b8" containerID="17c6f8f105ae4921e05f2c394ef1bdbce7049ed97fc5dda971790fd2b3b77a0d" exitCode=0 Mar 13 12:32:02 crc kubenswrapper[4837]: I0313 12:32:02.777933 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556752-5zvbj" event={"ID":"6b8e58c8-d8ed-4773-99bb-6b480514d2b8","Type":"ContainerDied","Data":"17c6f8f105ae4921e05f2c394ef1bdbce7049ed97fc5dda971790fd2b3b77a0d"} Mar 13 12:32:04 crc kubenswrapper[4837]: I0313 12:32:04.152261 4837 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556752-5zvbj" Mar 13 12:32:04 crc kubenswrapper[4837]: I0313 12:32:04.188836 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7gsn\" (UniqueName: \"kubernetes.io/projected/6b8e58c8-d8ed-4773-99bb-6b480514d2b8-kube-api-access-r7gsn\") pod \"6b8e58c8-d8ed-4773-99bb-6b480514d2b8\" (UID: \"6b8e58c8-d8ed-4773-99bb-6b480514d2b8\") " Mar 13 12:32:04 crc kubenswrapper[4837]: I0313 12:32:04.193912 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b8e58c8-d8ed-4773-99bb-6b480514d2b8-kube-api-access-r7gsn" (OuterVolumeSpecName: "kube-api-access-r7gsn") pod "6b8e58c8-d8ed-4773-99bb-6b480514d2b8" (UID: "6b8e58c8-d8ed-4773-99bb-6b480514d2b8"). InnerVolumeSpecName "kube-api-access-r7gsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:32:04 crc kubenswrapper[4837]: I0313 12:32:04.290892 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7gsn\" (UniqueName: \"kubernetes.io/projected/6b8e58c8-d8ed-4773-99bb-6b480514d2b8-kube-api-access-r7gsn\") on node \"crc\" DevicePath \"\"" Mar 13 12:32:04 crc kubenswrapper[4837]: I0313 12:32:04.796386 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556752-5zvbj" event={"ID":"6b8e58c8-d8ed-4773-99bb-6b480514d2b8","Type":"ContainerDied","Data":"ed83f04bc89e6bcbccb374bf4403804d23818bfe3d206cdaf46734815716f1cf"} Mar 13 12:32:04 crc kubenswrapper[4837]: I0313 12:32:04.796436 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed83f04bc89e6bcbccb374bf4403804d23818bfe3d206cdaf46734815716f1cf" Mar 13 12:32:04 crc kubenswrapper[4837]: I0313 12:32:04.796452 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556752-5zvbj" Mar 13 12:32:05 crc kubenswrapper[4837]: I0313 12:32:05.226953 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556746-vwjkq"] Mar 13 12:32:05 crc kubenswrapper[4837]: I0313 12:32:05.236556 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556746-vwjkq"] Mar 13 12:32:07 crc kubenswrapper[4837]: I0313 12:32:07.059258 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb9a9c7b-13fc-4655-91b2-a388c3870bf8" path="/var/lib/kubelet/pods/eb9a9c7b-13fc-4655-91b2-a388c3870bf8/volumes" Mar 13 12:32:30 crc kubenswrapper[4837]: I0313 12:32:30.775607 4837 scope.go:117] "RemoveContainer" containerID="f67572c8c6ce19fc30d5a363241b6294efe0fe117e547d75720e88fd9323c357" Mar 13 12:32:30 crc kubenswrapper[4837]: I0313 12:32:30.801551 4837 scope.go:117] "RemoveContainer" containerID="7b62114542f297a4d3e9e2cc215c273a290ed34518b0790734229b78c1fdfc3c" Mar 13 12:32:30 crc kubenswrapper[4837]: I0313 12:32:30.843322 4837 scope.go:117] "RemoveContainer" containerID="27d05aedac81655ab98a132d059aa69f642170fd7305465ba1bc55dadd819af6" Mar 13 12:32:30 crc kubenswrapper[4837]: I0313 12:32:30.903307 4837 scope.go:117] "RemoveContainer" containerID="a5eee508b483e64f809508b03aa1f0b24998bdde6d37da5807abe3cdc59f087e" Mar 13 12:33:35 crc kubenswrapper[4837]: I0313 12:33:35.484056 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:33:35 crc kubenswrapper[4837]: I0313 12:33:35.484631 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:34:00 crc kubenswrapper[4837]: I0313 12:34:00.146211 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556754-kvwpx"] Mar 13 12:34:00 crc kubenswrapper[4837]: E0313 12:34:00.147985 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b8e58c8-d8ed-4773-99bb-6b480514d2b8" containerName="oc" Mar 13 12:34:00 crc kubenswrapper[4837]: I0313 12:34:00.148013 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b8e58c8-d8ed-4773-99bb-6b480514d2b8" containerName="oc" Mar 13 12:34:00 crc kubenswrapper[4837]: I0313 12:34:00.148255 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b8e58c8-d8ed-4773-99bb-6b480514d2b8" containerName="oc" Mar 13 12:34:00 crc kubenswrapper[4837]: I0313 12:34:00.149232 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556754-kvwpx" Mar 13 12:34:00 crc kubenswrapper[4837]: I0313 12:34:00.151215 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:34:00 crc kubenswrapper[4837]: I0313 12:34:00.151908 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:34:00 crc kubenswrapper[4837]: I0313 12:34:00.152578 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:34:00 crc kubenswrapper[4837]: I0313 12:34:00.156713 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556754-kvwpx"] Mar 13 12:34:00 crc kubenswrapper[4837]: I0313 12:34:00.200784 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdknf\" (UniqueName: \"kubernetes.io/projected/41916e77-60c2-4138-b622-003f267ac74e-kube-api-access-wdknf\") pod \"auto-csr-approver-29556754-kvwpx\" (UID: \"41916e77-60c2-4138-b622-003f267ac74e\") " pod="openshift-infra/auto-csr-approver-29556754-kvwpx" Mar 13 12:34:00 crc kubenswrapper[4837]: I0313 12:34:00.302987 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdknf\" (UniqueName: \"kubernetes.io/projected/41916e77-60c2-4138-b622-003f267ac74e-kube-api-access-wdknf\") pod \"auto-csr-approver-29556754-kvwpx\" (UID: \"41916e77-60c2-4138-b622-003f267ac74e\") " pod="openshift-infra/auto-csr-approver-29556754-kvwpx" Mar 13 12:34:00 crc kubenswrapper[4837]: I0313 12:34:00.324944 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdknf\" (UniqueName: \"kubernetes.io/projected/41916e77-60c2-4138-b622-003f267ac74e-kube-api-access-wdknf\") pod \"auto-csr-approver-29556754-kvwpx\" (UID: \"41916e77-60c2-4138-b622-003f267ac74e\") " 
pod="openshift-infra/auto-csr-approver-29556754-kvwpx" Mar 13 12:34:00 crc kubenswrapper[4837]: I0313 12:34:00.468917 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556754-kvwpx" Mar 13 12:34:01 crc kubenswrapper[4837]: I0313 12:34:01.017020 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556754-kvwpx"] Mar 13 12:34:01 crc kubenswrapper[4837]: I0313 12:34:01.862995 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556754-kvwpx" event={"ID":"41916e77-60c2-4138-b622-003f267ac74e","Type":"ContainerStarted","Data":"0d0adfbe072e6cb409e30cd6bb9c285f58303b5968ad08015120f05eedfced64"} Mar 13 12:34:02 crc kubenswrapper[4837]: I0313 12:34:02.871909 4837 generic.go:334] "Generic (PLEG): container finished" podID="41916e77-60c2-4138-b622-003f267ac74e" containerID="d7ac07201d91371d67830ee944bf685a8cc89e1603bec505a0f5a81676127cf5" exitCode=0 Mar 13 12:34:02 crc kubenswrapper[4837]: I0313 12:34:02.871987 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556754-kvwpx" event={"ID":"41916e77-60c2-4138-b622-003f267ac74e","Type":"ContainerDied","Data":"d7ac07201d91371d67830ee944bf685a8cc89e1603bec505a0f5a81676127cf5"} Mar 13 12:34:04 crc kubenswrapper[4837]: I0313 12:34:04.288216 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556754-kvwpx" Mar 13 12:34:04 crc kubenswrapper[4837]: I0313 12:34:04.380399 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdknf\" (UniqueName: \"kubernetes.io/projected/41916e77-60c2-4138-b622-003f267ac74e-kube-api-access-wdknf\") pod \"41916e77-60c2-4138-b622-003f267ac74e\" (UID: \"41916e77-60c2-4138-b622-003f267ac74e\") " Mar 13 12:34:04 crc kubenswrapper[4837]: I0313 12:34:04.387116 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41916e77-60c2-4138-b622-003f267ac74e-kube-api-access-wdknf" (OuterVolumeSpecName: "kube-api-access-wdknf") pod "41916e77-60c2-4138-b622-003f267ac74e" (UID: "41916e77-60c2-4138-b622-003f267ac74e"). InnerVolumeSpecName "kube-api-access-wdknf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:34:04 crc kubenswrapper[4837]: I0313 12:34:04.483088 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdknf\" (UniqueName: \"kubernetes.io/projected/41916e77-60c2-4138-b622-003f267ac74e-kube-api-access-wdknf\") on node \"crc\" DevicePath \"\"" Mar 13 12:34:04 crc kubenswrapper[4837]: I0313 12:34:04.902543 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556754-kvwpx" event={"ID":"41916e77-60c2-4138-b622-003f267ac74e","Type":"ContainerDied","Data":"0d0adfbe072e6cb409e30cd6bb9c285f58303b5968ad08015120f05eedfced64"} Mar 13 12:34:04 crc kubenswrapper[4837]: I0313 12:34:04.902889 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d0adfbe072e6cb409e30cd6bb9c285f58303b5968ad08015120f05eedfced64" Mar 13 12:34:04 crc kubenswrapper[4837]: I0313 12:34:04.902712 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556754-kvwpx" Mar 13 12:34:05 crc kubenswrapper[4837]: I0313 12:34:05.352939 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556748-dgqh7"] Mar 13 12:34:05 crc kubenswrapper[4837]: I0313 12:34:05.359892 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556748-dgqh7"] Mar 13 12:34:05 crc kubenswrapper[4837]: I0313 12:34:05.484114 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:34:05 crc kubenswrapper[4837]: I0313 12:34:05.484183 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:34:07 crc kubenswrapper[4837]: I0313 12:34:07.059070 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72d56baf-1d17-4cbb-a351-8f5bf373c768" path="/var/lib/kubelet/pods/72d56baf-1d17-4cbb-a351-8f5bf373c768/volumes" Mar 13 12:34:31 crc kubenswrapper[4837]: I0313 12:34:30.999796 4837 scope.go:117] "RemoveContainer" containerID="1f6ecae5057b8984cf0bda7716bd86797af96a5f4c3a84ef8f5f85cb3c1def23" Mar 13 12:34:35 crc kubenswrapper[4837]: I0313 12:34:35.483924 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:34:35 crc kubenswrapper[4837]: 
I0313 12:34:35.484580 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:34:35 crc kubenswrapper[4837]: I0313 12:34:35.484666 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 12:34:35 crc kubenswrapper[4837]: I0313 12:34:35.485422 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2b3696e623d9d39a462c11ad27c35b22549ca870996c4db75be20036c9201734"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:34:35 crc kubenswrapper[4837]: I0313 12:34:35.485494 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://2b3696e623d9d39a462c11ad27c35b22549ca870996c4db75be20036c9201734" gracePeriod=600 Mar 13 12:34:36 crc kubenswrapper[4837]: I0313 12:34:36.188975 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="2b3696e623d9d39a462c11ad27c35b22549ca870996c4db75be20036c9201734" exitCode=0 Mar 13 12:34:36 crc kubenswrapper[4837]: I0313 12:34:36.189065 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"2b3696e623d9d39a462c11ad27c35b22549ca870996c4db75be20036c9201734"} Mar 13 12:34:36 crc 
kubenswrapper[4837]: I0313 12:34:36.189537 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65"} Mar 13 12:34:36 crc kubenswrapper[4837]: I0313 12:34:36.189568 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.163214 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556756-xz7n5"] Mar 13 12:36:00 crc kubenswrapper[4837]: E0313 12:36:00.164293 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41916e77-60c2-4138-b622-003f267ac74e" containerName="oc" Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.164310 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="41916e77-60c2-4138-b622-003f267ac74e" containerName="oc" Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.164589 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="41916e77-60c2-4138-b622-003f267ac74e" containerName="oc" Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.168485 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556756-xz7n5" Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.170796 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.171115 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.172554 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.177524 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556756-xz7n5"] Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.269116 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5qgs\" (UniqueName: \"kubernetes.io/projected/8833ed1c-80bb-4529-9f4a-6109d1a39f13-kube-api-access-b5qgs\") pod \"auto-csr-approver-29556756-xz7n5\" (UID: \"8833ed1c-80bb-4529-9f4a-6109d1a39f13\") " pod="openshift-infra/auto-csr-approver-29556756-xz7n5" Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.370443 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5qgs\" (UniqueName: \"kubernetes.io/projected/8833ed1c-80bb-4529-9f4a-6109d1a39f13-kube-api-access-b5qgs\") pod \"auto-csr-approver-29556756-xz7n5\" (UID: \"8833ed1c-80bb-4529-9f4a-6109d1a39f13\") " pod="openshift-infra/auto-csr-approver-29556756-xz7n5" Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.398429 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5qgs\" (UniqueName: \"kubernetes.io/projected/8833ed1c-80bb-4529-9f4a-6109d1a39f13-kube-api-access-b5qgs\") pod \"auto-csr-approver-29556756-xz7n5\" (UID: \"8833ed1c-80bb-4529-9f4a-6109d1a39f13\") " 
pod="openshift-infra/auto-csr-approver-29556756-xz7n5" Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.488302 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556756-xz7n5" Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.965309 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556756-xz7n5"] Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.973769 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:36:01 crc kubenswrapper[4837]: I0313 12:36:01.892490 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556756-xz7n5" event={"ID":"8833ed1c-80bb-4529-9f4a-6109d1a39f13","Type":"ContainerStarted","Data":"756d5bc9a4b96d9378d5949b7aa9590b00ec5ed10f86898e6efa807e9d3a455d"} Mar 13 12:36:02 crc kubenswrapper[4837]: I0313 12:36:02.902578 4837 generic.go:334] "Generic (PLEG): container finished" podID="8833ed1c-80bb-4529-9f4a-6109d1a39f13" containerID="8991bbc909e2098b2d6fb047c31dca6c613e8c861798107378538f426d77e480" exitCode=0 Mar 13 12:36:02 crc kubenswrapper[4837]: I0313 12:36:02.902698 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556756-xz7n5" event={"ID":"8833ed1c-80bb-4529-9f4a-6109d1a39f13","Type":"ContainerDied","Data":"8991bbc909e2098b2d6fb047c31dca6c613e8c861798107378538f426d77e480"} Mar 13 12:36:04 crc kubenswrapper[4837]: I0313 12:36:04.370332 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556756-xz7n5" Mar 13 12:36:04 crc kubenswrapper[4837]: I0313 12:36:04.462617 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5qgs\" (UniqueName: \"kubernetes.io/projected/8833ed1c-80bb-4529-9f4a-6109d1a39f13-kube-api-access-b5qgs\") pod \"8833ed1c-80bb-4529-9f4a-6109d1a39f13\" (UID: \"8833ed1c-80bb-4529-9f4a-6109d1a39f13\") " Mar 13 12:36:04 crc kubenswrapper[4837]: I0313 12:36:04.471320 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8833ed1c-80bb-4529-9f4a-6109d1a39f13-kube-api-access-b5qgs" (OuterVolumeSpecName: "kube-api-access-b5qgs") pod "8833ed1c-80bb-4529-9f4a-6109d1a39f13" (UID: "8833ed1c-80bb-4529-9f4a-6109d1a39f13"). InnerVolumeSpecName "kube-api-access-b5qgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:36:04 crc kubenswrapper[4837]: I0313 12:36:04.568179 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5qgs\" (UniqueName: \"kubernetes.io/projected/8833ed1c-80bb-4529-9f4a-6109d1a39f13-kube-api-access-b5qgs\") on node \"crc\" DevicePath \"\"" Mar 13 12:36:04 crc kubenswrapper[4837]: I0313 12:36:04.920740 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556756-xz7n5" event={"ID":"8833ed1c-80bb-4529-9f4a-6109d1a39f13","Type":"ContainerDied","Data":"756d5bc9a4b96d9378d5949b7aa9590b00ec5ed10f86898e6efa807e9d3a455d"} Mar 13 12:36:04 crc kubenswrapper[4837]: I0313 12:36:04.920981 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="756d5bc9a4b96d9378d5949b7aa9590b00ec5ed10f86898e6efa807e9d3a455d" Mar 13 12:36:04 crc kubenswrapper[4837]: I0313 12:36:04.920791 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556756-xz7n5" Mar 13 12:36:05 crc kubenswrapper[4837]: I0313 12:36:05.435599 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556750-xl9g6"] Mar 13 12:36:05 crc kubenswrapper[4837]: I0313 12:36:05.443429 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556750-xl9g6"] Mar 13 12:36:07 crc kubenswrapper[4837]: I0313 12:36:07.100250 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e39a0509-ea55-4b46-a3dc-473bb655cad8" path="/var/lib/kubelet/pods/e39a0509-ea55-4b46-a3dc-473bb655cad8/volumes" Mar 13 12:36:31 crc kubenswrapper[4837]: I0313 12:36:31.093881 4837 scope.go:117] "RemoveContainer" containerID="b4cb982598c9648b581b684b81629524a6916bc2574ae740b552fa7040fb8d2e" Mar 13 12:36:35 crc kubenswrapper[4837]: I0313 12:36:35.483974 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:36:35 crc kubenswrapper[4837]: I0313 12:36:35.484523 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:37:05 crc kubenswrapper[4837]: I0313 12:37:05.484301 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:37:05 crc kubenswrapper[4837]: 
I0313 12:37:05.484890 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:37:35 crc kubenswrapper[4837]: I0313 12:37:35.483701 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:37:35 crc kubenswrapper[4837]: I0313 12:37:35.484240 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:37:35 crc kubenswrapper[4837]: I0313 12:37:35.484291 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 12:37:35 crc kubenswrapper[4837]: I0313 12:37:35.485125 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:37:35 crc kubenswrapper[4837]: I0313 12:37:35.485186 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" 
containerName="machine-config-daemon" containerID="cri-o://0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" gracePeriod=600 Mar 13 12:37:35 crc kubenswrapper[4837]: E0313 12:37:35.604809 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:37:36 crc kubenswrapper[4837]: I0313 12:37:36.077003 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" exitCode=0 Mar 13 12:37:36 crc kubenswrapper[4837]: I0313 12:37:36.077086 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65"} Mar 13 12:37:36 crc kubenswrapper[4837]: I0313 12:37:36.077541 4837 scope.go:117] "RemoveContainer" containerID="2b3696e623d9d39a462c11ad27c35b22549ca870996c4db75be20036c9201734" Mar 13 12:37:36 crc kubenswrapper[4837]: I0313 12:37:36.078221 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:37:36 crc kubenswrapper[4837]: E0313 12:37:36.078491 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:37:49 crc kubenswrapper[4837]: I0313 12:37:49.048462 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:37:49 crc kubenswrapper[4837]: E0313 12:37:49.049066 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.048470 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:38:00 crc kubenswrapper[4837]: E0313 12:38:00.049240 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.149816 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556758-srcvt"] Mar 13 12:38:00 crc kubenswrapper[4837]: E0313 12:38:00.150304 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8833ed1c-80bb-4529-9f4a-6109d1a39f13" containerName="oc" Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.150328 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8833ed1c-80bb-4529-9f4a-6109d1a39f13" containerName="oc" 
Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.150521 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8833ed1c-80bb-4529-9f4a-6109d1a39f13" containerName="oc" Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.151152 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556758-srcvt" Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.157193 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.157460 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.157729 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.172318 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556758-srcvt"] Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.249378 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sngz\" (UniqueName: \"kubernetes.io/projected/0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38-kube-api-access-4sngz\") pod \"auto-csr-approver-29556758-srcvt\" (UID: \"0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38\") " pod="openshift-infra/auto-csr-approver-29556758-srcvt" Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.351436 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sngz\" (UniqueName: \"kubernetes.io/projected/0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38-kube-api-access-4sngz\") pod \"auto-csr-approver-29556758-srcvt\" (UID: \"0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38\") " pod="openshift-infra/auto-csr-approver-29556758-srcvt" Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 
12:38:00.370252 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sngz\" (UniqueName: \"kubernetes.io/projected/0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38-kube-api-access-4sngz\") pod \"auto-csr-approver-29556758-srcvt\" (UID: \"0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38\") " pod="openshift-infra/auto-csr-approver-29556758-srcvt" Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.476445 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556758-srcvt" Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.894891 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556758-srcvt"] Mar 13 12:38:01 crc kubenswrapper[4837]: I0313 12:38:01.450370 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556758-srcvt" event={"ID":"0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38","Type":"ContainerStarted","Data":"88ecdaf5d9fd6adb3ae64176b2aaffe17d690c37bef9bf13945ecd3020fa68dd"} Mar 13 12:38:02 crc kubenswrapper[4837]: I0313 12:38:02.460126 4837 generic.go:334] "Generic (PLEG): container finished" podID="0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38" containerID="62169da6c39018c4d64900197bc422e10f99368271388e87ca1a65e2ba0fb126" exitCode=0 Mar 13 12:38:02 crc kubenswrapper[4837]: I0313 12:38:02.460167 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556758-srcvt" event={"ID":"0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38","Type":"ContainerDied","Data":"62169da6c39018c4d64900197bc422e10f99368271388e87ca1a65e2ba0fb126"} Mar 13 12:38:03 crc kubenswrapper[4837]: I0313 12:38:03.886810 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556758-srcvt" Mar 13 12:38:04 crc kubenswrapper[4837]: I0313 12:38:04.026438 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sngz\" (UniqueName: \"kubernetes.io/projected/0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38-kube-api-access-4sngz\") pod \"0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38\" (UID: \"0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38\") " Mar 13 12:38:04 crc kubenswrapper[4837]: I0313 12:38:04.032801 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38-kube-api-access-4sngz" (OuterVolumeSpecName: "kube-api-access-4sngz") pod "0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38" (UID: "0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38"). InnerVolumeSpecName "kube-api-access-4sngz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:38:04 crc kubenswrapper[4837]: I0313 12:38:04.129357 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sngz\" (UniqueName: \"kubernetes.io/projected/0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38-kube-api-access-4sngz\") on node \"crc\" DevicePath \"\"" Mar 13 12:38:04 crc kubenswrapper[4837]: I0313 12:38:04.476791 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556758-srcvt" event={"ID":"0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38","Type":"ContainerDied","Data":"88ecdaf5d9fd6adb3ae64176b2aaffe17d690c37bef9bf13945ecd3020fa68dd"} Mar 13 12:38:04 crc kubenswrapper[4837]: I0313 12:38:04.476845 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88ecdaf5d9fd6adb3ae64176b2aaffe17d690c37bef9bf13945ecd3020fa68dd" Mar 13 12:38:04 crc kubenswrapper[4837]: I0313 12:38:04.476849 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556758-srcvt" Mar 13 12:38:04 crc kubenswrapper[4837]: I0313 12:38:04.954048 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556752-5zvbj"] Mar 13 12:38:04 crc kubenswrapper[4837]: I0313 12:38:04.962440 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556752-5zvbj"] Mar 13 12:38:05 crc kubenswrapper[4837]: I0313 12:38:05.060368 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b8e58c8-d8ed-4773-99bb-6b480514d2b8" path="/var/lib/kubelet/pods/6b8e58c8-d8ed-4773-99bb-6b480514d2b8/volumes" Mar 13 12:38:11 crc kubenswrapper[4837]: I0313 12:38:11.048622 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:38:11 crc kubenswrapper[4837]: E0313 12:38:11.049458 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:38:22 crc kubenswrapper[4837]: I0313 12:38:22.048538 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:38:22 crc kubenswrapper[4837]: E0313 12:38:22.049381 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" 
podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:38:31 crc kubenswrapper[4837]: I0313 12:38:31.195892 4837 scope.go:117] "RemoveContainer" containerID="17c6f8f105ae4921e05f2c394ef1bdbce7049ed97fc5dda971790fd2b3b77a0d" Mar 13 12:38:35 crc kubenswrapper[4837]: I0313 12:38:35.055997 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:38:35 crc kubenswrapper[4837]: E0313 12:38:35.056829 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:38:46 crc kubenswrapper[4837]: I0313 12:38:46.048447 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:38:46 crc kubenswrapper[4837]: E0313 12:38:46.049471 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:38:58 crc kubenswrapper[4837]: I0313 12:38:58.048138 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:38:58 crc kubenswrapper[4837]: E0313 12:38:58.048753 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:39:12 crc kubenswrapper[4837]: I0313 12:39:12.232311 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:39:12 crc kubenswrapper[4837]: E0313 12:39:12.233234 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:39:24 crc kubenswrapper[4837]: I0313 12:39:24.048868 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:39:24 crc kubenswrapper[4837]: E0313 12:39:24.049796 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:39:35 crc kubenswrapper[4837]: I0313 12:39:35.054149 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:39:35 crc kubenswrapper[4837]: E0313 12:39:35.054977 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:39:47 crc kubenswrapper[4837]: I0313 12:39:47.047924 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:39:47 crc kubenswrapper[4837]: E0313 12:39:47.048837 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:39:59 crc kubenswrapper[4837]: I0313 12:39:59.048897 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:39:59 crc kubenswrapper[4837]: E0313 12:39:59.049691 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:40:00 crc kubenswrapper[4837]: I0313 12:40:00.142913 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556760-zcnxn"] Mar 13 12:40:00 crc kubenswrapper[4837]: E0313 12:40:00.143633 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38" containerName="oc" Mar 13 12:40:00 crc 
kubenswrapper[4837]: I0313 12:40:00.143668 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38" containerName="oc" Mar 13 12:40:00 crc kubenswrapper[4837]: I0313 12:40:00.143838 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38" containerName="oc" Mar 13 12:40:00 crc kubenswrapper[4837]: I0313 12:40:00.144508 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556760-zcnxn" Mar 13 12:40:00 crc kubenswrapper[4837]: I0313 12:40:00.147265 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:40:00 crc kubenswrapper[4837]: I0313 12:40:00.147429 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:40:00 crc kubenswrapper[4837]: I0313 12:40:00.147743 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:40:00 crc kubenswrapper[4837]: I0313 12:40:00.151035 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556760-zcnxn"] Mar 13 12:40:00 crc kubenswrapper[4837]: I0313 12:40:00.211266 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6lfz\" (UniqueName: \"kubernetes.io/projected/a273cb74-6dcc-4e87-8f25-db5c77132250-kube-api-access-r6lfz\") pod \"auto-csr-approver-29556760-zcnxn\" (UID: \"a273cb74-6dcc-4e87-8f25-db5c77132250\") " pod="openshift-infra/auto-csr-approver-29556760-zcnxn" Mar 13 12:40:00 crc kubenswrapper[4837]: I0313 12:40:00.312975 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6lfz\" (UniqueName: \"kubernetes.io/projected/a273cb74-6dcc-4e87-8f25-db5c77132250-kube-api-access-r6lfz\") pod \"auto-csr-approver-29556760-zcnxn\" 
(UID: \"a273cb74-6dcc-4e87-8f25-db5c77132250\") " pod="openshift-infra/auto-csr-approver-29556760-zcnxn" Mar 13 12:40:00 crc kubenswrapper[4837]: I0313 12:40:00.331805 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6lfz\" (UniqueName: \"kubernetes.io/projected/a273cb74-6dcc-4e87-8f25-db5c77132250-kube-api-access-r6lfz\") pod \"auto-csr-approver-29556760-zcnxn\" (UID: \"a273cb74-6dcc-4e87-8f25-db5c77132250\") " pod="openshift-infra/auto-csr-approver-29556760-zcnxn" Mar 13 12:40:00 crc kubenswrapper[4837]: I0313 12:40:00.497571 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556760-zcnxn" Mar 13 12:40:00 crc kubenswrapper[4837]: I0313 12:40:00.922046 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556760-zcnxn"] Mar 13 12:40:01 crc kubenswrapper[4837]: I0313 12:40:01.548866 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556760-zcnxn" event={"ID":"a273cb74-6dcc-4e87-8f25-db5c77132250","Type":"ContainerStarted","Data":"35a75c11f21fd3aad88d6e5d5ecb767c99ae66bc7ca8f14c0484d9bd5481efb2"} Mar 13 12:40:02 crc kubenswrapper[4837]: I0313 12:40:02.561413 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556760-zcnxn" event={"ID":"a273cb74-6dcc-4e87-8f25-db5c77132250","Type":"ContainerStarted","Data":"da2f9878f57615785241ef1796e14de81a297b17cfd5ebaf3f55711c66c5482b"} Mar 13 12:40:02 crc kubenswrapper[4837]: I0313 12:40:02.578422 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556760-zcnxn" podStartSLOduration=1.279592313 podStartE2EDuration="2.578368828s" podCreationTimestamp="2026-03-13 12:40:00 +0000 UTC" firstStartedPulling="2026-03-13 12:40:00.927726458 +0000 UTC m=+3116.565993221" lastFinishedPulling="2026-03-13 12:40:02.226502973 +0000 UTC 
m=+3117.864769736" observedRunningTime="2026-03-13 12:40:02.574083383 +0000 UTC m=+3118.212350176" watchObservedRunningTime="2026-03-13 12:40:02.578368828 +0000 UTC m=+3118.216635601" Mar 13 12:40:03 crc kubenswrapper[4837]: I0313 12:40:03.573827 4837 generic.go:334] "Generic (PLEG): container finished" podID="a273cb74-6dcc-4e87-8f25-db5c77132250" containerID="da2f9878f57615785241ef1796e14de81a297b17cfd5ebaf3f55711c66c5482b" exitCode=0 Mar 13 12:40:03 crc kubenswrapper[4837]: I0313 12:40:03.573872 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556760-zcnxn" event={"ID":"a273cb74-6dcc-4e87-8f25-db5c77132250","Type":"ContainerDied","Data":"da2f9878f57615785241ef1796e14de81a297b17cfd5ebaf3f55711c66c5482b"} Mar 13 12:40:04 crc kubenswrapper[4837]: I0313 12:40:04.962723 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556760-zcnxn" Mar 13 12:40:05 crc kubenswrapper[4837]: I0313 12:40:05.121112 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6lfz\" (UniqueName: \"kubernetes.io/projected/a273cb74-6dcc-4e87-8f25-db5c77132250-kube-api-access-r6lfz\") pod \"a273cb74-6dcc-4e87-8f25-db5c77132250\" (UID: \"a273cb74-6dcc-4e87-8f25-db5c77132250\") " Mar 13 12:40:05 crc kubenswrapper[4837]: I0313 12:40:05.127495 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a273cb74-6dcc-4e87-8f25-db5c77132250-kube-api-access-r6lfz" (OuterVolumeSpecName: "kube-api-access-r6lfz") pod "a273cb74-6dcc-4e87-8f25-db5c77132250" (UID: "a273cb74-6dcc-4e87-8f25-db5c77132250"). InnerVolumeSpecName "kube-api-access-r6lfz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:40:05 crc kubenswrapper[4837]: I0313 12:40:05.223561 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6lfz\" (UniqueName: \"kubernetes.io/projected/a273cb74-6dcc-4e87-8f25-db5c77132250-kube-api-access-r6lfz\") on node \"crc\" DevicePath \"\"" Mar 13 12:40:05 crc kubenswrapper[4837]: I0313 12:40:05.590883 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556760-zcnxn" event={"ID":"a273cb74-6dcc-4e87-8f25-db5c77132250","Type":"ContainerDied","Data":"35a75c11f21fd3aad88d6e5d5ecb767c99ae66bc7ca8f14c0484d9bd5481efb2"} Mar 13 12:40:05 crc kubenswrapper[4837]: I0313 12:40:05.591180 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35a75c11f21fd3aad88d6e5d5ecb767c99ae66bc7ca8f14c0484d9bd5481efb2" Mar 13 12:40:05 crc kubenswrapper[4837]: I0313 12:40:05.590956 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556760-zcnxn" Mar 13 12:40:05 crc kubenswrapper[4837]: I0313 12:40:05.645931 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556754-kvwpx"] Mar 13 12:40:05 crc kubenswrapper[4837]: I0313 12:40:05.653766 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556754-kvwpx"] Mar 13 12:40:07 crc kubenswrapper[4837]: I0313 12:40:07.058925 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41916e77-60c2-4138-b622-003f267ac74e" path="/var/lib/kubelet/pods/41916e77-60c2-4138-b622-003f267ac74e/volumes" Mar 13 12:40:12 crc kubenswrapper[4837]: I0313 12:40:12.048318 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:40:12 crc kubenswrapper[4837]: E0313 12:40:12.049287 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:40:23 crc kubenswrapper[4837]: I0313 12:40:23.049228 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:40:23 crc kubenswrapper[4837]: E0313 12:40:23.050816 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:40:31 crc kubenswrapper[4837]: I0313 12:40:31.282994 4837 scope.go:117] "RemoveContainer" containerID="d7ac07201d91371d67830ee944bf685a8cc89e1603bec505a0f5a81676127cf5" Mar 13 12:40:35 crc kubenswrapper[4837]: I0313 12:40:35.063969 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:40:35 crc kubenswrapper[4837]: E0313 12:40:35.064718 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:40:47 crc kubenswrapper[4837]: I0313 12:40:47.050045 4837 scope.go:117] "RemoveContainer" 
containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:40:47 crc kubenswrapper[4837]: E0313 12:40:47.055310 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:40:58 crc kubenswrapper[4837]: I0313 12:40:58.048263 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:40:58 crc kubenswrapper[4837]: E0313 12:40:58.049048 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.143085 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bdzjd"] Mar 13 12:41:06 crc kubenswrapper[4837]: E0313 12:41:06.144289 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a273cb74-6dcc-4e87-8f25-db5c77132250" containerName="oc" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.144309 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a273cb74-6dcc-4e87-8f25-db5c77132250" containerName="oc" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.144592 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a273cb74-6dcc-4e87-8f25-db5c77132250" containerName="oc" Mar 13 12:41:06 crc 
kubenswrapper[4837]: I0313 12:41:06.146472 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.155113 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bdzjd"] Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.274944 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkv2w\" (UniqueName: \"kubernetes.io/projected/6859fd59-d276-46f7-85ce-3e4a1d934bc0-kube-api-access-rkv2w\") pod \"certified-operators-bdzjd\" (UID: \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\") " pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.275073 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6859fd59-d276-46f7-85ce-3e4a1d934bc0-catalog-content\") pod \"certified-operators-bdzjd\" (UID: \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\") " pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.275118 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6859fd59-d276-46f7-85ce-3e4a1d934bc0-utilities\") pod \"certified-operators-bdzjd\" (UID: \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\") " pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.377938 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkv2w\" (UniqueName: \"kubernetes.io/projected/6859fd59-d276-46f7-85ce-3e4a1d934bc0-kube-api-access-rkv2w\") pod \"certified-operators-bdzjd\" (UID: \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\") " 
pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.378073 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6859fd59-d276-46f7-85ce-3e4a1d934bc0-catalog-content\") pod \"certified-operators-bdzjd\" (UID: \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\") " pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.378120 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6859fd59-d276-46f7-85ce-3e4a1d934bc0-utilities\") pod \"certified-operators-bdzjd\" (UID: \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\") " pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.378732 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6859fd59-d276-46f7-85ce-3e4a1d934bc0-utilities\") pod \"certified-operators-bdzjd\" (UID: \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\") " pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.378746 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6859fd59-d276-46f7-85ce-3e4a1d934bc0-catalog-content\") pod \"certified-operators-bdzjd\" (UID: \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\") " pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.396972 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkv2w\" (UniqueName: \"kubernetes.io/projected/6859fd59-d276-46f7-85ce-3e4a1d934bc0-kube-api-access-rkv2w\") pod \"certified-operators-bdzjd\" (UID: \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\") " 
pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.464961 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.971050 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bdzjd"] Mar 13 12:41:07 crc kubenswrapper[4837]: I0313 12:41:07.507450 4837 generic.go:334] "Generic (PLEG): container finished" podID="6859fd59-d276-46f7-85ce-3e4a1d934bc0" containerID="560f57b3491ce1ee56c3ba3082a4da29491c357f456b4c3fd4fe07ad4ec44958" exitCode=0 Mar 13 12:41:07 crc kubenswrapper[4837]: I0313 12:41:07.507530 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdzjd" event={"ID":"6859fd59-d276-46f7-85ce-3e4a1d934bc0","Type":"ContainerDied","Data":"560f57b3491ce1ee56c3ba3082a4da29491c357f456b4c3fd4fe07ad4ec44958"} Mar 13 12:41:07 crc kubenswrapper[4837]: I0313 12:41:07.507771 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdzjd" event={"ID":"6859fd59-d276-46f7-85ce-3e4a1d934bc0","Type":"ContainerStarted","Data":"70fa120127e07316b3b39c8800b91193ea0056418fb3accc2bf31bf0968af42a"} Mar 13 12:41:07 crc kubenswrapper[4837]: I0313 12:41:07.509598 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:41:09 crc kubenswrapper[4837]: I0313 12:41:09.528542 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdzjd" event={"ID":"6859fd59-d276-46f7-85ce-3e4a1d934bc0","Type":"ContainerStarted","Data":"73dc7e0603990c2da015458d2bed2e5aee4a0bf61d78fa122e11168fdabe881b"} Mar 13 12:41:10 crc kubenswrapper[4837]: I0313 12:41:10.048802 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 
12:41:10 crc kubenswrapper[4837]: E0313 12:41:10.049158 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:41:11 crc kubenswrapper[4837]: I0313 12:41:11.550167 4837 generic.go:334] "Generic (PLEG): container finished" podID="6859fd59-d276-46f7-85ce-3e4a1d934bc0" containerID="73dc7e0603990c2da015458d2bed2e5aee4a0bf61d78fa122e11168fdabe881b" exitCode=0 Mar 13 12:41:11 crc kubenswrapper[4837]: I0313 12:41:11.550266 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdzjd" event={"ID":"6859fd59-d276-46f7-85ce-3e4a1d934bc0","Type":"ContainerDied","Data":"73dc7e0603990c2da015458d2bed2e5aee4a0bf61d78fa122e11168fdabe881b"} Mar 13 12:41:13 crc kubenswrapper[4837]: I0313 12:41:13.581520 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdzjd" event={"ID":"6859fd59-d276-46f7-85ce-3e4a1d934bc0","Type":"ContainerStarted","Data":"18ba41893d2fe107d0c2a7ad10a77546c6c462cb3679fbbb958f219373bc9137"} Mar 13 12:41:13 crc kubenswrapper[4837]: I0313 12:41:13.606436 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bdzjd" podStartSLOduration=2.53221213 podStartE2EDuration="7.606412647s" podCreationTimestamp="2026-03-13 12:41:06 +0000 UTC" firstStartedPulling="2026-03-13 12:41:07.509331939 +0000 UTC m=+3183.147598712" lastFinishedPulling="2026-03-13 12:41:12.583532476 +0000 UTC m=+3188.221799229" observedRunningTime="2026-03-13 12:41:13.605321993 +0000 UTC m=+3189.243588776" watchObservedRunningTime="2026-03-13 12:41:13.606412647 +0000 
UTC m=+3189.244679420" Mar 13 12:41:16 crc kubenswrapper[4837]: I0313 12:41:16.465103 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:16 crc kubenswrapper[4837]: I0313 12:41:16.465409 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:16 crc kubenswrapper[4837]: I0313 12:41:16.510934 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:21 crc kubenswrapper[4837]: I0313 12:41:21.671742 4837 generic.go:334] "Generic (PLEG): container finished" podID="66bdda91-c5b6-4879-9adf-21846884c797" containerID="bda10fa8fd12669f2f471650132835bc9a8231ba850dd11df31ebbad360b9cf6" exitCode=0 Mar 13 12:41:21 crc kubenswrapper[4837]: I0313 12:41:21.671875 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"66bdda91-c5b6-4879-9adf-21846884c797","Type":"ContainerDied","Data":"bda10fa8fd12669f2f471650132835bc9a8231ba850dd11df31ebbad360b9cf6"} Mar 13 12:41:22 crc kubenswrapper[4837]: I0313 12:41:22.048875 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:41:22 crc kubenswrapper[4837]: E0313 12:41:22.049188 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.108483 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.259245 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-openstack-config-secret\") pod \"66bdda91-c5b6-4879-9adf-21846884c797\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.259314 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66bdda91-c5b6-4879-9adf-21846884c797-openstack-config\") pod \"66bdda91-c5b6-4879-9adf-21846884c797\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.259337 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66bdda91-c5b6-4879-9adf-21846884c797-config-data\") pod \"66bdda91-c5b6-4879-9adf-21846884c797\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.259364 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"66bdda91-c5b6-4879-9adf-21846884c797\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.259512 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/66bdda91-c5b6-4879-9adf-21846884c797-test-operator-ephemeral-temporary\") pod \"66bdda91-c5b6-4879-9adf-21846884c797\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.259549 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-ssh-key\") pod \"66bdda91-c5b6-4879-9adf-21846884c797\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.259672 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/66bdda91-c5b6-4879-9adf-21846884c797-test-operator-ephemeral-workdir\") pod \"66bdda91-c5b6-4879-9adf-21846884c797\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.259699 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdksm\" (UniqueName: \"kubernetes.io/projected/66bdda91-c5b6-4879-9adf-21846884c797-kube-api-access-zdksm\") pod \"66bdda91-c5b6-4879-9adf-21846884c797\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.259728 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-ca-certs\") pod \"66bdda91-c5b6-4879-9adf-21846884c797\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.260886 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66bdda91-c5b6-4879-9adf-21846884c797-config-data" (OuterVolumeSpecName: "config-data") pod "66bdda91-c5b6-4879-9adf-21846884c797" (UID: "66bdda91-c5b6-4879-9adf-21846884c797"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.261361 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66bdda91-c5b6-4879-9adf-21846884c797-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "66bdda91-c5b6-4879-9adf-21846884c797" (UID: "66bdda91-c5b6-4879-9adf-21846884c797"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.266653 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66bdda91-c5b6-4879-9adf-21846884c797-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "66bdda91-c5b6-4879-9adf-21846884c797" (UID: "66bdda91-c5b6-4879-9adf-21846884c797"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.266805 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66bdda91-c5b6-4879-9adf-21846884c797-kube-api-access-zdksm" (OuterVolumeSpecName: "kube-api-access-zdksm") pod "66bdda91-c5b6-4879-9adf-21846884c797" (UID: "66bdda91-c5b6-4879-9adf-21846884c797"). InnerVolumeSpecName "kube-api-access-zdksm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.266904 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "66bdda91-c5b6-4879-9adf-21846884c797" (UID: "66bdda91-c5b6-4879-9adf-21846884c797"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.305195 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "66bdda91-c5b6-4879-9adf-21846884c797" (UID: "66bdda91-c5b6-4879-9adf-21846884c797"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.306970 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "66bdda91-c5b6-4879-9adf-21846884c797" (UID: "66bdda91-c5b6-4879-9adf-21846884c797"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.310920 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66bdda91-c5b6-4879-9adf-21846884c797-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "66bdda91-c5b6-4879-9adf-21846884c797" (UID: "66bdda91-c5b6-4879-9adf-21846884c797"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.322076 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "66bdda91-c5b6-4879-9adf-21846884c797" (UID: "66bdda91-c5b6-4879-9adf-21846884c797"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.362450 4837 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/66bdda91-c5b6-4879-9adf-21846884c797-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.362482 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.362492 4837 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/66bdda91-c5b6-4879-9adf-21846884c797-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.362502 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdksm\" (UniqueName: \"kubernetes.io/projected/66bdda91-c5b6-4879-9adf-21846884c797-kube-api-access-zdksm\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.362511 4837 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.362519 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.362528 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66bdda91-c5b6-4879-9adf-21846884c797-openstack-config\") on node \"crc\" DevicePath 
\"\"" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.362536 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66bdda91-c5b6-4879-9adf-21846884c797-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.362567 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.381959 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.464357 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.691431 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"66bdda91-c5b6-4879-9adf-21846884c797","Type":"ContainerDied","Data":"e0109150fdc9bce6fc2a2f4d23a6692ef997ab608b50f1cec0fb2562f9d86611"} Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.691489 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0109150fdc9bce6fc2a2f4d23a6692ef997ab608b50f1cec0fb2562f9d86611" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.691531 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 13 12:41:26 crc kubenswrapper[4837]: I0313 12:41:26.515274 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:26 crc kubenswrapper[4837]: I0313 12:41:26.580966 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bdzjd"] Mar 13 12:41:26 crc kubenswrapper[4837]: I0313 12:41:26.723953 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bdzjd" podUID="6859fd59-d276-46f7-85ce-3e4a1d934bc0" containerName="registry-server" containerID="cri-o://18ba41893d2fe107d0c2a7ad10a77546c6c462cb3679fbbb958f219373bc9137" gracePeriod=2 Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.150403 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.251330 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkv2w\" (UniqueName: \"kubernetes.io/projected/6859fd59-d276-46f7-85ce-3e4a1d934bc0-kube-api-access-rkv2w\") pod \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\" (UID: \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\") " Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.251435 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6859fd59-d276-46f7-85ce-3e4a1d934bc0-catalog-content\") pod \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\" (UID: \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\") " Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.251467 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6859fd59-d276-46f7-85ce-3e4a1d934bc0-utilities\") pod 
\"6859fd59-d276-46f7-85ce-3e4a1d934bc0\" (UID: \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\") " Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.252667 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6859fd59-d276-46f7-85ce-3e4a1d934bc0-utilities" (OuterVolumeSpecName: "utilities") pod "6859fd59-d276-46f7-85ce-3e4a1d934bc0" (UID: "6859fd59-d276-46f7-85ce-3e4a1d934bc0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.261122 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6859fd59-d276-46f7-85ce-3e4a1d934bc0-kube-api-access-rkv2w" (OuterVolumeSpecName: "kube-api-access-rkv2w") pod "6859fd59-d276-46f7-85ce-3e4a1d934bc0" (UID: "6859fd59-d276-46f7-85ce-3e4a1d934bc0"). InnerVolumeSpecName "kube-api-access-rkv2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.328959 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6859fd59-d276-46f7-85ce-3e4a1d934bc0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6859fd59-d276-46f7-85ce-3e4a1d934bc0" (UID: "6859fd59-d276-46f7-85ce-3e4a1d934bc0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.353158 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkv2w\" (UniqueName: \"kubernetes.io/projected/6859fd59-d276-46f7-85ce-3e4a1d934bc0-kube-api-access-rkv2w\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.353197 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6859fd59-d276-46f7-85ce-3e4a1d934bc0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.353210 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6859fd59-d276-46f7-85ce-3e4a1d934bc0-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.622501 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 13 12:41:27 crc kubenswrapper[4837]: E0313 12:41:27.623074 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6859fd59-d276-46f7-85ce-3e4a1d934bc0" containerName="extract-content" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.623095 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6859fd59-d276-46f7-85ce-3e4a1d934bc0" containerName="extract-content" Mar 13 12:41:27 crc kubenswrapper[4837]: E0313 12:41:27.623122 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6859fd59-d276-46f7-85ce-3e4a1d934bc0" containerName="registry-server" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.623131 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6859fd59-d276-46f7-85ce-3e4a1d934bc0" containerName="registry-server" Mar 13 12:41:27 crc kubenswrapper[4837]: E0313 12:41:27.623150 4837 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="66bdda91-c5b6-4879-9adf-21846884c797" containerName="tempest-tests-tempest-tests-runner" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.623159 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="66bdda91-c5b6-4879-9adf-21846884c797" containerName="tempest-tests-tempest-tests-runner" Mar 13 12:41:27 crc kubenswrapper[4837]: E0313 12:41:27.623179 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6859fd59-d276-46f7-85ce-3e4a1d934bc0" containerName="extract-utilities" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.623187 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6859fd59-d276-46f7-85ce-3e4a1d934bc0" containerName="extract-utilities" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.623429 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6859fd59-d276-46f7-85ce-3e4a1d934bc0" containerName="registry-server" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.623468 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="66bdda91-c5b6-4879-9adf-21846884c797" containerName="tempest-tests-tempest-tests-runner" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.624252 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.626731 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bvdx7" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.630392 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.657241 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0244acef-b630-4b97-9bb5-9f99de391613\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.657548 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4xwf\" (UniqueName: \"kubernetes.io/projected/0244acef-b630-4b97-9bb5-9f99de391613-kube-api-access-g4xwf\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0244acef-b630-4b97-9bb5-9f99de391613\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.737314 4837 generic.go:334] "Generic (PLEG): container finished" podID="6859fd59-d276-46f7-85ce-3e4a1d934bc0" containerID="18ba41893d2fe107d0c2a7ad10a77546c6c462cb3679fbbb958f219373bc9137" exitCode=0 Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.737351 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdzjd" event={"ID":"6859fd59-d276-46f7-85ce-3e4a1d934bc0","Type":"ContainerDied","Data":"18ba41893d2fe107d0c2a7ad10a77546c6c462cb3679fbbb958f219373bc9137"} Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.737406 4837 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdzjd" event={"ID":"6859fd59-d276-46f7-85ce-3e4a1d934bc0","Type":"ContainerDied","Data":"70fa120127e07316b3b39c8800b91193ea0056418fb3accc2bf31bf0968af42a"} Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.737418 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.737435 4837 scope.go:117] "RemoveContainer" containerID="18ba41893d2fe107d0c2a7ad10a77546c6c462cb3679fbbb958f219373bc9137" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.758932 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0244acef-b630-4b97-9bb5-9f99de391613\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.759021 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4xwf\" (UniqueName: \"kubernetes.io/projected/0244acef-b630-4b97-9bb5-9f99de391613-kube-api-access-g4xwf\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0244acef-b630-4b97-9bb5-9f99de391613\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.760452 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0244acef-b630-4b97-9bb5-9f99de391613\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 
12:41:27.776211 4837 scope.go:117] "RemoveContainer" containerID="73dc7e0603990c2da015458d2bed2e5aee4a0bf61d78fa122e11168fdabe881b" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.790522 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4xwf\" (UniqueName: \"kubernetes.io/projected/0244acef-b630-4b97-9bb5-9f99de391613-kube-api-access-g4xwf\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0244acef-b630-4b97-9bb5-9f99de391613\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.797448 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bdzjd"] Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.808511 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0244acef-b630-4b97-9bb5-9f99de391613\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.809687 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bdzjd"] Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.847956 4837 scope.go:117] "RemoveContainer" containerID="560f57b3491ce1ee56c3ba3082a4da29491c357f456b4c3fd4fe07ad4ec44958" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.917320 4837 scope.go:117] "RemoveContainer" containerID="18ba41893d2fe107d0c2a7ad10a77546c6c462cb3679fbbb958f219373bc9137" Mar 13 12:41:27 crc kubenswrapper[4837]: E0313 12:41:27.921387 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18ba41893d2fe107d0c2a7ad10a77546c6c462cb3679fbbb958f219373bc9137\": container with ID starting with 
18ba41893d2fe107d0c2a7ad10a77546c6c462cb3679fbbb958f219373bc9137 not found: ID does not exist" containerID="18ba41893d2fe107d0c2a7ad10a77546c6c462cb3679fbbb958f219373bc9137" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.921457 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18ba41893d2fe107d0c2a7ad10a77546c6c462cb3679fbbb958f219373bc9137"} err="failed to get container status \"18ba41893d2fe107d0c2a7ad10a77546c6c462cb3679fbbb958f219373bc9137\": rpc error: code = NotFound desc = could not find container \"18ba41893d2fe107d0c2a7ad10a77546c6c462cb3679fbbb958f219373bc9137\": container with ID starting with 18ba41893d2fe107d0c2a7ad10a77546c6c462cb3679fbbb958f219373bc9137 not found: ID does not exist" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.921483 4837 scope.go:117] "RemoveContainer" containerID="73dc7e0603990c2da015458d2bed2e5aee4a0bf61d78fa122e11168fdabe881b" Mar 13 12:41:27 crc kubenswrapper[4837]: E0313 12:41:27.922395 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73dc7e0603990c2da015458d2bed2e5aee4a0bf61d78fa122e11168fdabe881b\": container with ID starting with 73dc7e0603990c2da015458d2bed2e5aee4a0bf61d78fa122e11168fdabe881b not found: ID does not exist" containerID="73dc7e0603990c2da015458d2bed2e5aee4a0bf61d78fa122e11168fdabe881b" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.922420 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73dc7e0603990c2da015458d2bed2e5aee4a0bf61d78fa122e11168fdabe881b"} err="failed to get container status \"73dc7e0603990c2da015458d2bed2e5aee4a0bf61d78fa122e11168fdabe881b\": rpc error: code = NotFound desc = could not find container \"73dc7e0603990c2da015458d2bed2e5aee4a0bf61d78fa122e11168fdabe881b\": container with ID starting with 73dc7e0603990c2da015458d2bed2e5aee4a0bf61d78fa122e11168fdabe881b not found: ID does not 
exist" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.922434 4837 scope.go:117] "RemoveContainer" containerID="560f57b3491ce1ee56c3ba3082a4da29491c357f456b4c3fd4fe07ad4ec44958" Mar 13 12:41:27 crc kubenswrapper[4837]: E0313 12:41:27.922731 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"560f57b3491ce1ee56c3ba3082a4da29491c357f456b4c3fd4fe07ad4ec44958\": container with ID starting with 560f57b3491ce1ee56c3ba3082a4da29491c357f456b4c3fd4fe07ad4ec44958 not found: ID does not exist" containerID="560f57b3491ce1ee56c3ba3082a4da29491c357f456b4c3fd4fe07ad4ec44958" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.922765 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"560f57b3491ce1ee56c3ba3082a4da29491c357f456b4c3fd4fe07ad4ec44958"} err="failed to get container status \"560f57b3491ce1ee56c3ba3082a4da29491c357f456b4c3fd4fe07ad4ec44958\": rpc error: code = NotFound desc = could not find container \"560f57b3491ce1ee56c3ba3082a4da29491c357f456b4c3fd4fe07ad4ec44958\": container with ID starting with 560f57b3491ce1ee56c3ba3082a4da29491c357f456b4c3fd4fe07ad4ec44958 not found: ID does not exist" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.940332 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 12:41:28 crc kubenswrapper[4837]: I0313 12:41:28.388096 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 13 12:41:28 crc kubenswrapper[4837]: I0313 12:41:28.746237 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"0244acef-b630-4b97-9bb5-9f99de391613","Type":"ContainerStarted","Data":"bc32204be51f88f37836971c45cfffe3d3563a242517dbcf9f55f53e20d96bf0"} Mar 13 12:41:29 crc kubenswrapper[4837]: I0313 12:41:29.058338 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6859fd59-d276-46f7-85ce-3e4a1d934bc0" path="/var/lib/kubelet/pods/6859fd59-d276-46f7-85ce-3e4a1d934bc0/volumes" Mar 13 12:41:29 crc kubenswrapper[4837]: I0313 12:41:29.760106 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"0244acef-b630-4b97-9bb5-9f99de391613","Type":"ContainerStarted","Data":"8276c52e1e7b4f82cdb660276ad4c0dc37d71176853a44bf73d49eadf6bf1474"} Mar 13 12:41:29 crc kubenswrapper[4837]: I0313 12:41:29.782478 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.028978374 podStartE2EDuration="2.782452487s" podCreationTimestamp="2026-03-13 12:41:27 +0000 UTC" firstStartedPulling="2026-03-13 12:41:28.392491689 +0000 UTC m=+3204.030758452" lastFinishedPulling="2026-03-13 12:41:29.145965802 +0000 UTC m=+3204.784232565" observedRunningTime="2026-03-13 12:41:29.779003928 +0000 UTC m=+3205.417270701" watchObservedRunningTime="2026-03-13 12:41:29.782452487 +0000 UTC m=+3205.420719260" Mar 13 12:41:30 crc kubenswrapper[4837]: I0313 12:41:30.170990 4837 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-qxmkr"] Mar 13 12:41:30 crc kubenswrapper[4837]: I0313 12:41:30.173038 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:41:30 crc kubenswrapper[4837]: I0313 12:41:30.179099 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qxmkr"] Mar 13 12:41:30 crc kubenswrapper[4837]: I0313 12:41:30.306604 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7daa3751-d057-474f-9a0f-79fdada329a2-utilities\") pod \"redhat-operators-qxmkr\" (UID: \"7daa3751-d057-474f-9a0f-79fdada329a2\") " pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:41:30 crc kubenswrapper[4837]: I0313 12:41:30.306653 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7daa3751-d057-474f-9a0f-79fdada329a2-catalog-content\") pod \"redhat-operators-qxmkr\" (UID: \"7daa3751-d057-474f-9a0f-79fdada329a2\") " pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:41:30 crc kubenswrapper[4837]: I0313 12:41:30.306956 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz5c2\" (UniqueName: \"kubernetes.io/projected/7daa3751-d057-474f-9a0f-79fdada329a2-kube-api-access-mz5c2\") pod \"redhat-operators-qxmkr\" (UID: \"7daa3751-d057-474f-9a0f-79fdada329a2\") " pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:41:30 crc kubenswrapper[4837]: I0313 12:41:30.408579 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz5c2\" (UniqueName: \"kubernetes.io/projected/7daa3751-d057-474f-9a0f-79fdada329a2-kube-api-access-mz5c2\") pod \"redhat-operators-qxmkr\" (UID: \"7daa3751-d057-474f-9a0f-79fdada329a2\") " 
pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:41:30 crc kubenswrapper[4837]: I0313 12:41:30.408722 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7daa3751-d057-474f-9a0f-79fdada329a2-utilities\") pod \"redhat-operators-qxmkr\" (UID: \"7daa3751-d057-474f-9a0f-79fdada329a2\") " pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:41:30 crc kubenswrapper[4837]: I0313 12:41:30.408743 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7daa3751-d057-474f-9a0f-79fdada329a2-catalog-content\") pod \"redhat-operators-qxmkr\" (UID: \"7daa3751-d057-474f-9a0f-79fdada329a2\") " pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:41:30 crc kubenswrapper[4837]: I0313 12:41:30.409254 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7daa3751-d057-474f-9a0f-79fdada329a2-catalog-content\") pod \"redhat-operators-qxmkr\" (UID: \"7daa3751-d057-474f-9a0f-79fdada329a2\") " pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:41:30 crc kubenswrapper[4837]: I0313 12:41:30.409397 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7daa3751-d057-474f-9a0f-79fdada329a2-utilities\") pod \"redhat-operators-qxmkr\" (UID: \"7daa3751-d057-474f-9a0f-79fdada329a2\") " pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:41:30 crc kubenswrapper[4837]: I0313 12:41:30.429240 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz5c2\" (UniqueName: \"kubernetes.io/projected/7daa3751-d057-474f-9a0f-79fdada329a2-kube-api-access-mz5c2\") pod \"redhat-operators-qxmkr\" (UID: \"7daa3751-d057-474f-9a0f-79fdada329a2\") " pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:41:30 crc 
kubenswrapper[4837]: I0313 12:41:30.493461 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:41:31 crc kubenswrapper[4837]: W0313 12:41:31.016206 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7daa3751_d057_474f_9a0f_79fdada329a2.slice/crio-d198cd5e4de064380ab84dfd25c9bf6271c0c2b325c9d7fcdbcadff172704d51 WatchSource:0}: Error finding container d198cd5e4de064380ab84dfd25c9bf6271c0c2b325c9d7fcdbcadff172704d51: Status 404 returned error can't find the container with id d198cd5e4de064380ab84dfd25c9bf6271c0c2b325c9d7fcdbcadff172704d51 Mar 13 12:41:31 crc kubenswrapper[4837]: I0313 12:41:31.020974 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qxmkr"] Mar 13 12:41:31 crc kubenswrapper[4837]: I0313 12:41:31.779479 4837 generic.go:334] "Generic (PLEG): container finished" podID="7daa3751-d057-474f-9a0f-79fdada329a2" containerID="1b2c7f9c7376794e5560f9e528d8a765c014e6561edefe605eb3c2dd6333a2a0" exitCode=0 Mar 13 12:41:31 crc kubenswrapper[4837]: I0313 12:41:31.779604 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxmkr" event={"ID":"7daa3751-d057-474f-9a0f-79fdada329a2","Type":"ContainerDied","Data":"1b2c7f9c7376794e5560f9e528d8a765c014e6561edefe605eb3c2dd6333a2a0"} Mar 13 12:41:31 crc kubenswrapper[4837]: I0313 12:41:31.780138 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxmkr" event={"ID":"7daa3751-d057-474f-9a0f-79fdada329a2","Type":"ContainerStarted","Data":"d198cd5e4de064380ab84dfd25c9bf6271c0c2b325c9d7fcdbcadff172704d51"} Mar 13 12:41:32 crc kubenswrapper[4837]: I0313 12:41:32.790729 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxmkr" 
event={"ID":"7daa3751-d057-474f-9a0f-79fdada329a2","Type":"ContainerStarted","Data":"75c1c0316d400d5f2fc848d84d2ba2bd6d6be3266e3bc1c32dfc91a9178306cc"} Mar 13 12:41:34 crc kubenswrapper[4837]: I0313 12:41:34.048409 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:41:34 crc kubenswrapper[4837]: E0313 12:41:34.048964 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:41:37 crc kubenswrapper[4837]: I0313 12:41:37.856910 4837 generic.go:334] "Generic (PLEG): container finished" podID="7daa3751-d057-474f-9a0f-79fdada329a2" containerID="75c1c0316d400d5f2fc848d84d2ba2bd6d6be3266e3bc1c32dfc91a9178306cc" exitCode=0 Mar 13 12:41:37 crc kubenswrapper[4837]: I0313 12:41:37.857007 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxmkr" event={"ID":"7daa3751-d057-474f-9a0f-79fdada329a2","Type":"ContainerDied","Data":"75c1c0316d400d5f2fc848d84d2ba2bd6d6be3266e3bc1c32dfc91a9178306cc"} Mar 13 12:41:38 crc kubenswrapper[4837]: I0313 12:41:38.871854 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxmkr" event={"ID":"7daa3751-d057-474f-9a0f-79fdada329a2","Type":"ContainerStarted","Data":"d7c4eda8f971bf0f17ec252261cf6e6a9dc759fc4a7ea3f6c6a13900e540a85f"} Mar 13 12:41:38 crc kubenswrapper[4837]: I0313 12:41:38.899011 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qxmkr" podStartSLOduration=2.337802442 podStartE2EDuration="8.898987945s" 
podCreationTimestamp="2026-03-13 12:41:30 +0000 UTC" firstStartedPulling="2026-03-13 12:41:31.781313117 +0000 UTC m=+3207.419579880" lastFinishedPulling="2026-03-13 12:41:38.34249862 +0000 UTC m=+3213.980765383" observedRunningTime="2026-03-13 12:41:38.894244426 +0000 UTC m=+3214.532511189" watchObservedRunningTime="2026-03-13 12:41:38.898987945 +0000 UTC m=+3214.537254708" Mar 13 12:41:39 crc kubenswrapper[4837]: I0313 12:41:39.314148 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xpq8f"] Mar 13 12:41:39 crc kubenswrapper[4837]: I0313 12:41:39.316713 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:39 crc kubenswrapper[4837]: I0313 12:41:39.336726 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xpq8f"] Mar 13 12:41:39 crc kubenswrapper[4837]: I0313 12:41:39.483573 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-utilities\") pod \"redhat-marketplace-xpq8f\" (UID: \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\") " pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:39 crc kubenswrapper[4837]: I0313 12:41:39.483889 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5gjp\" (UniqueName: \"kubernetes.io/projected/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-kube-api-access-q5gjp\") pod \"redhat-marketplace-xpq8f\" (UID: \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\") " pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:39 crc kubenswrapper[4837]: I0313 12:41:39.484305 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-catalog-content\") pod \"redhat-marketplace-xpq8f\" (UID: \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\") " pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:39 crc kubenswrapper[4837]: I0313 12:41:39.585839 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5gjp\" (UniqueName: \"kubernetes.io/projected/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-kube-api-access-q5gjp\") pod \"redhat-marketplace-xpq8f\" (UID: \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\") " pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:39 crc kubenswrapper[4837]: I0313 12:41:39.586200 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-catalog-content\") pod \"redhat-marketplace-xpq8f\" (UID: \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\") " pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:39 crc kubenswrapper[4837]: I0313 12:41:39.586326 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-utilities\") pod \"redhat-marketplace-xpq8f\" (UID: \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\") " pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:39 crc kubenswrapper[4837]: I0313 12:41:39.586703 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-catalog-content\") pod \"redhat-marketplace-xpq8f\" (UID: \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\") " pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:39 crc kubenswrapper[4837]: I0313 12:41:39.586819 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-utilities\") pod \"redhat-marketplace-xpq8f\" (UID: \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\") " pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:39 crc kubenswrapper[4837]: I0313 12:41:39.609696 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5gjp\" (UniqueName: \"kubernetes.io/projected/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-kube-api-access-q5gjp\") pod \"redhat-marketplace-xpq8f\" (UID: \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\") " pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:39 crc kubenswrapper[4837]: I0313 12:41:39.646107 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:40 crc kubenswrapper[4837]: I0313 12:41:40.115716 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xpq8f"] Mar 13 12:41:40 crc kubenswrapper[4837]: W0313 12:41:40.122900 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7db90d1b_c7eb_4de2_8783_417fd25bdc6f.slice/crio-af64dc414a6b7437b27100700bc03b998a48953755354439146e81c31e764a10 WatchSource:0}: Error finding container af64dc414a6b7437b27100700bc03b998a48953755354439146e81c31e764a10: Status 404 returned error can't find the container with id af64dc414a6b7437b27100700bc03b998a48953755354439146e81c31e764a10 Mar 13 12:41:40 crc kubenswrapper[4837]: I0313 12:41:40.494624 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:41:40 crc kubenswrapper[4837]: I0313 12:41:40.494700 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:41:40 crc kubenswrapper[4837]: I0313 12:41:40.891614 4837 generic.go:334] "Generic (PLEG): container 
finished" podID="7db90d1b-c7eb-4de2-8783-417fd25bdc6f" containerID="673af5193713526abe23d3ad06717349574812d657dea12723d9f30f91f6c7a3" exitCode=0 Mar 13 12:41:40 crc kubenswrapper[4837]: I0313 12:41:40.891677 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xpq8f" event={"ID":"7db90d1b-c7eb-4de2-8783-417fd25bdc6f","Type":"ContainerDied","Data":"673af5193713526abe23d3ad06717349574812d657dea12723d9f30f91f6c7a3"} Mar 13 12:41:40 crc kubenswrapper[4837]: I0313 12:41:40.891982 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xpq8f" event={"ID":"7db90d1b-c7eb-4de2-8783-417fd25bdc6f","Type":"ContainerStarted","Data":"af64dc414a6b7437b27100700bc03b998a48953755354439146e81c31e764a10"} Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.551381 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qxmkr" podUID="7daa3751-d057-474f-9a0f-79fdada329a2" containerName="registry-server" probeResult="failure" output=< Mar 13 12:41:41 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Mar 13 12:41:41 crc kubenswrapper[4837]: > Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.702995 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wdpnx"] Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.706477 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wdpnx"
Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.714400 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wdpnx"]
Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.832227 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75dcf43e-e9a3-4956-9582-9663efc7a07b-utilities\") pod \"community-operators-wdpnx\" (UID: \"75dcf43e-e9a3-4956-9582-9663efc7a07b\") " pod="openshift-marketplace/community-operators-wdpnx"
Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.832296 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cmm5\" (UniqueName: \"kubernetes.io/projected/75dcf43e-e9a3-4956-9582-9663efc7a07b-kube-api-access-8cmm5\") pod \"community-operators-wdpnx\" (UID: \"75dcf43e-e9a3-4956-9582-9663efc7a07b\") " pod="openshift-marketplace/community-operators-wdpnx"
Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.832682 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75dcf43e-e9a3-4956-9582-9663efc7a07b-catalog-content\") pod \"community-operators-wdpnx\" (UID: \"75dcf43e-e9a3-4956-9582-9663efc7a07b\") " pod="openshift-marketplace/community-operators-wdpnx"
Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.905001 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xpq8f" event={"ID":"7db90d1b-c7eb-4de2-8783-417fd25bdc6f","Type":"ContainerStarted","Data":"0275010ad2cdf8863266a601f875f06c0f5d019bb4aa2ea17b90064fa41c4bf5"}
Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.934535 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75dcf43e-e9a3-4956-9582-9663efc7a07b-catalog-content\") pod \"community-operators-wdpnx\" (UID: \"75dcf43e-e9a3-4956-9582-9663efc7a07b\") " pod="openshift-marketplace/community-operators-wdpnx"
Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.934941 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75dcf43e-e9a3-4956-9582-9663efc7a07b-utilities\") pod \"community-operators-wdpnx\" (UID: \"75dcf43e-e9a3-4956-9582-9663efc7a07b\") " pod="openshift-marketplace/community-operators-wdpnx"
Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.935064 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cmm5\" (UniqueName: \"kubernetes.io/projected/75dcf43e-e9a3-4956-9582-9663efc7a07b-kube-api-access-8cmm5\") pod \"community-operators-wdpnx\" (UID: \"75dcf43e-e9a3-4956-9582-9663efc7a07b\") " pod="openshift-marketplace/community-operators-wdpnx"
Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.935387 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75dcf43e-e9a3-4956-9582-9663efc7a07b-utilities\") pod \"community-operators-wdpnx\" (UID: \"75dcf43e-e9a3-4956-9582-9663efc7a07b\") " pod="openshift-marketplace/community-operators-wdpnx"
Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.935498 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75dcf43e-e9a3-4956-9582-9663efc7a07b-catalog-content\") pod \"community-operators-wdpnx\" (UID: \"75dcf43e-e9a3-4956-9582-9663efc7a07b\") " pod="openshift-marketplace/community-operators-wdpnx"
Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.957849 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cmm5\" (UniqueName: \"kubernetes.io/projected/75dcf43e-e9a3-4956-9582-9663efc7a07b-kube-api-access-8cmm5\") pod \"community-operators-wdpnx\" (UID: \"75dcf43e-e9a3-4956-9582-9663efc7a07b\") " pod="openshift-marketplace/community-operators-wdpnx"
Mar 13 12:41:42 crc kubenswrapper[4837]: I0313 12:41:42.042373 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wdpnx"
Mar 13 12:41:42 crc kubenswrapper[4837]: I0313 12:41:42.600965 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wdpnx"]
Mar 13 12:41:42 crc kubenswrapper[4837]: I0313 12:41:42.914693 4837 generic.go:334] "Generic (PLEG): container finished" podID="7db90d1b-c7eb-4de2-8783-417fd25bdc6f" containerID="0275010ad2cdf8863266a601f875f06c0f5d019bb4aa2ea17b90064fa41c4bf5" exitCode=0
Mar 13 12:41:42 crc kubenswrapper[4837]: I0313 12:41:42.914990 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xpq8f" event={"ID":"7db90d1b-c7eb-4de2-8783-417fd25bdc6f","Type":"ContainerDied","Data":"0275010ad2cdf8863266a601f875f06c0f5d019bb4aa2ea17b90064fa41c4bf5"}
Mar 13 12:41:42 crc kubenswrapper[4837]: I0313 12:41:42.917357 4837 generic.go:334] "Generic (PLEG): container finished" podID="75dcf43e-e9a3-4956-9582-9663efc7a07b" containerID="1cdd21f387fca0b63738dbf1a676060e6b1a2034a411336f47b42f6f70b348d1" exitCode=0
Mar 13 12:41:42 crc kubenswrapper[4837]: I0313 12:41:42.917414 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdpnx" event={"ID":"75dcf43e-e9a3-4956-9582-9663efc7a07b","Type":"ContainerDied","Data":"1cdd21f387fca0b63738dbf1a676060e6b1a2034a411336f47b42f6f70b348d1"}
Mar 13 12:41:42 crc kubenswrapper[4837]: I0313 12:41:42.917458 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdpnx" event={"ID":"75dcf43e-e9a3-4956-9582-9663efc7a07b","Type":"ContainerStarted","Data":"e5e7a4dd77c0df97d0b938b4b0928d9a15ca84c095db93e2aee641e194eec7e0"}
Mar 13 12:41:43 crc kubenswrapper[4837]: I0313 12:41:43.931904 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xpq8f" event={"ID":"7db90d1b-c7eb-4de2-8783-417fd25bdc6f","Type":"ContainerStarted","Data":"13a40aecebc8beb6c6f0fd5033d8b1148b58792521473e01cfa82acbe62fe7ff"}
Mar 13 12:41:43 crc kubenswrapper[4837]: I0313 12:41:43.937227 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdpnx" event={"ID":"75dcf43e-e9a3-4956-9582-9663efc7a07b","Type":"ContainerStarted","Data":"6f8327024dd21bd081ba023805b2e005959158663382bf7c861522d50eaf9255"}
Mar 13 12:41:43 crc kubenswrapper[4837]: I0313 12:41:43.979138 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xpq8f" podStartSLOduration=2.560836048 podStartE2EDuration="4.979120539s" podCreationTimestamp="2026-03-13 12:41:39 +0000 UTC" firstStartedPulling="2026-03-13 12:41:40.893261021 +0000 UTC m=+3216.531527784" lastFinishedPulling="2026-03-13 12:41:43.311545512 +0000 UTC m=+3218.949812275" observedRunningTime="2026-03-13 12:41:43.956443362 +0000 UTC m=+3219.594710125" watchObservedRunningTime="2026-03-13 12:41:43.979120539 +0000 UTC m=+3219.617387302"
Mar 13 12:41:46 crc kubenswrapper[4837]: I0313 12:41:46.979813 4837 generic.go:334] "Generic (PLEG): container finished" podID="75dcf43e-e9a3-4956-9582-9663efc7a07b" containerID="6f8327024dd21bd081ba023805b2e005959158663382bf7c861522d50eaf9255" exitCode=0
Mar 13 12:41:46 crc kubenswrapper[4837]: I0313 12:41:46.979877 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdpnx" event={"ID":"75dcf43e-e9a3-4956-9582-9663efc7a07b","Type":"ContainerDied","Data":"6f8327024dd21bd081ba023805b2e005959158663382bf7c861522d50eaf9255"}
Mar 13 12:41:47 crc kubenswrapper[4837]: I0313 12:41:47.992774 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdpnx" event={"ID":"75dcf43e-e9a3-4956-9582-9663efc7a07b","Type":"ContainerStarted","Data":"491e172a4fdd94d834f91503b7344d1840993d62c62190274ec8d7067e9948b7"}
Mar 13 12:41:48 crc kubenswrapper[4837]: I0313 12:41:48.013131 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wdpnx" podStartSLOduration=2.524175416 podStartE2EDuration="7.013112227s" podCreationTimestamp="2026-03-13 12:41:41 +0000 UTC" firstStartedPulling="2026-03-13 12:41:42.92053695 +0000 UTC m=+3218.558803713" lastFinishedPulling="2026-03-13 12:41:47.409473761 +0000 UTC m=+3223.047740524" observedRunningTime="2026-03-13 12:41:48.007284303 +0000 UTC m=+3223.645551066" watchObservedRunningTime="2026-03-13 12:41:48.013112227 +0000 UTC m=+3223.651379000"
Mar 13 12:41:48 crc kubenswrapper[4837]: I0313 12:41:48.048580 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65"
Mar 13 12:41:48 crc kubenswrapper[4837]: E0313 12:41:48.048881 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8"
Mar 13 12:41:49 crc kubenswrapper[4837]: I0313 12:41:49.647722 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xpq8f"
Mar 13 12:41:49 crc kubenswrapper[4837]: I0313 12:41:49.648088 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xpq8f"
Mar 13 12:41:49 crc kubenswrapper[4837]: I0313 12:41:49.693906 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xpq8f"
Mar 13 12:41:50 crc kubenswrapper[4837]: I0313 12:41:50.061510 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xpq8f"
Mar 13 12:41:51 crc kubenswrapper[4837]: I0313 12:41:51.296254 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xpq8f"]
Mar 13 12:41:51 crc kubenswrapper[4837]: I0313 12:41:51.557608 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qxmkr" podUID="7daa3751-d057-474f-9a0f-79fdada329a2" containerName="registry-server" probeResult="failure" output=<
Mar 13 12:41:51 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s
Mar 13 12:41:51 crc kubenswrapper[4837]: >
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.033397 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xpq8f" podUID="7db90d1b-c7eb-4de2-8783-417fd25bdc6f" containerName="registry-server" containerID="cri-o://13a40aecebc8beb6c6f0fd5033d8b1148b58792521473e01cfa82acbe62fe7ff" gracePeriod=2
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.043225 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wdpnx"
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.043296 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wdpnx"
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.105240 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wdpnx"
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.491495 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jkb99/must-gather-4lckb"]
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.493079 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jkb99/must-gather-4lckb"
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.497004 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-jkb99"/"default-dockercfg-6g2c8"
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.497004 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jkb99"/"openshift-service-ca.crt"
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.497200 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jkb99"/"kube-root-ca.crt"
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.507695 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xpq8f"
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.523136 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jkb99/must-gather-4lckb"]
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.665588 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-utilities\") pod \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\" (UID: \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\") "
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.665660 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5gjp\" (UniqueName: \"kubernetes.io/projected/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-kube-api-access-q5gjp\") pod \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\" (UID: \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\") "
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.665813 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-catalog-content\") pod \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\" (UID: \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\") "
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.666074 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbzbt\" (UniqueName: \"kubernetes.io/projected/8822de14-eaa5-4016-91fd-611718d9b51a-kube-api-access-zbzbt\") pod \"must-gather-4lckb\" (UID: \"8822de14-eaa5-4016-91fd-611718d9b51a\") " pod="openshift-must-gather-jkb99/must-gather-4lckb"
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.666187 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8822de14-eaa5-4016-91fd-611718d9b51a-must-gather-output\") pod \"must-gather-4lckb\" (UID: \"8822de14-eaa5-4016-91fd-611718d9b51a\") " pod="openshift-must-gather-jkb99/must-gather-4lckb"
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.666854 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-utilities" (OuterVolumeSpecName: "utilities") pod "7db90d1b-c7eb-4de2-8783-417fd25bdc6f" (UID: "7db90d1b-c7eb-4de2-8783-417fd25bdc6f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.681457 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-kube-api-access-q5gjp" (OuterVolumeSpecName: "kube-api-access-q5gjp") pod "7db90d1b-c7eb-4de2-8783-417fd25bdc6f" (UID: "7db90d1b-c7eb-4de2-8783-417fd25bdc6f"). InnerVolumeSpecName "kube-api-access-q5gjp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.692265 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7db90d1b-c7eb-4de2-8783-417fd25bdc6f" (UID: "7db90d1b-c7eb-4de2-8783-417fd25bdc6f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.768303 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbzbt\" (UniqueName: \"kubernetes.io/projected/8822de14-eaa5-4016-91fd-611718d9b51a-kube-api-access-zbzbt\") pod \"must-gather-4lckb\" (UID: \"8822de14-eaa5-4016-91fd-611718d9b51a\") " pod="openshift-must-gather-jkb99/must-gather-4lckb"
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.768446 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8822de14-eaa5-4016-91fd-611718d9b51a-must-gather-output\") pod \"must-gather-4lckb\" (UID: \"8822de14-eaa5-4016-91fd-611718d9b51a\") " pod="openshift-must-gather-jkb99/must-gather-4lckb"
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.768497 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.768510 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5gjp\" (UniqueName: \"kubernetes.io/projected/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-kube-api-access-q5gjp\") on node \"crc\" DevicePath \"\""
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.768519 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.768985 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8822de14-eaa5-4016-91fd-611718d9b51a-must-gather-output\") pod \"must-gather-4lckb\" (UID: \"8822de14-eaa5-4016-91fd-611718d9b51a\") " pod="openshift-must-gather-jkb99/must-gather-4lckb"
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.799469 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbzbt\" (UniqueName: \"kubernetes.io/projected/8822de14-eaa5-4016-91fd-611718d9b51a-kube-api-access-zbzbt\") pod \"must-gather-4lckb\" (UID: \"8822de14-eaa5-4016-91fd-611718d9b51a\") " pod="openshift-must-gather-jkb99/must-gather-4lckb"
Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.821826 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jkb99/must-gather-4lckb"
Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.048499 4837 generic.go:334] "Generic (PLEG): container finished" podID="7db90d1b-c7eb-4de2-8783-417fd25bdc6f" containerID="13a40aecebc8beb6c6f0fd5033d8b1148b58792521473e01cfa82acbe62fe7ff" exitCode=0
Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.048717 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xpq8f"
Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.068408 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xpq8f" event={"ID":"7db90d1b-c7eb-4de2-8783-417fd25bdc6f","Type":"ContainerDied","Data":"13a40aecebc8beb6c6f0fd5033d8b1148b58792521473e01cfa82acbe62fe7ff"}
Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.068767 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xpq8f" event={"ID":"7db90d1b-c7eb-4de2-8783-417fd25bdc6f","Type":"ContainerDied","Data":"af64dc414a6b7437b27100700bc03b998a48953755354439146e81c31e764a10"}
Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.068796 4837 scope.go:117] "RemoveContainer" containerID="13a40aecebc8beb6c6f0fd5033d8b1148b58792521473e01cfa82acbe62fe7ff"
Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.095058 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xpq8f"]
Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.104536 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xpq8f"]
Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.106591 4837 scope.go:117] "RemoveContainer" containerID="0275010ad2cdf8863266a601f875f06c0f5d019bb4aa2ea17b90064fa41c4bf5"
Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.109755 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wdpnx"
Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.132600 4837 scope.go:117] "RemoveContainer" containerID="673af5193713526abe23d3ad06717349574812d657dea12723d9f30f91f6c7a3"
Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.149982 4837 scope.go:117] "RemoveContainer" containerID="13a40aecebc8beb6c6f0fd5033d8b1148b58792521473e01cfa82acbe62fe7ff"
Mar 13 12:41:53 crc kubenswrapper[4837]: E0313 12:41:53.151153 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13a40aecebc8beb6c6f0fd5033d8b1148b58792521473e01cfa82acbe62fe7ff\": container with ID starting with 13a40aecebc8beb6c6f0fd5033d8b1148b58792521473e01cfa82acbe62fe7ff not found: ID does not exist" containerID="13a40aecebc8beb6c6f0fd5033d8b1148b58792521473e01cfa82acbe62fe7ff"
Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.151180 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13a40aecebc8beb6c6f0fd5033d8b1148b58792521473e01cfa82acbe62fe7ff"} err="failed to get container status \"13a40aecebc8beb6c6f0fd5033d8b1148b58792521473e01cfa82acbe62fe7ff\": rpc error: code = NotFound desc = could not find container \"13a40aecebc8beb6c6f0fd5033d8b1148b58792521473e01cfa82acbe62fe7ff\": container with ID starting with 13a40aecebc8beb6c6f0fd5033d8b1148b58792521473e01cfa82acbe62fe7ff not found: ID does not exist"
Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.151198 4837 scope.go:117] "RemoveContainer" containerID="0275010ad2cdf8863266a601f875f06c0f5d019bb4aa2ea17b90064fa41c4bf5"
Mar 13 12:41:53 crc kubenswrapper[4837]: E0313 12:41:53.151442 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0275010ad2cdf8863266a601f875f06c0f5d019bb4aa2ea17b90064fa41c4bf5\": container with ID starting with 0275010ad2cdf8863266a601f875f06c0f5d019bb4aa2ea17b90064fa41c4bf5 not found: ID does not exist" containerID="0275010ad2cdf8863266a601f875f06c0f5d019bb4aa2ea17b90064fa41c4bf5"
Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.151458 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0275010ad2cdf8863266a601f875f06c0f5d019bb4aa2ea17b90064fa41c4bf5"} err="failed to get container status \"0275010ad2cdf8863266a601f875f06c0f5d019bb4aa2ea17b90064fa41c4bf5\": rpc error: code = NotFound desc = could not find container \"0275010ad2cdf8863266a601f875f06c0f5d019bb4aa2ea17b90064fa41c4bf5\": container with ID starting with 0275010ad2cdf8863266a601f875f06c0f5d019bb4aa2ea17b90064fa41c4bf5 not found: ID does not exist"
Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.151470 4837 scope.go:117] "RemoveContainer" containerID="673af5193713526abe23d3ad06717349574812d657dea12723d9f30f91f6c7a3"
Mar 13 12:41:53 crc kubenswrapper[4837]: E0313 12:41:53.151759 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"673af5193713526abe23d3ad06717349574812d657dea12723d9f30f91f6c7a3\": container with ID starting with 673af5193713526abe23d3ad06717349574812d657dea12723d9f30f91f6c7a3 not found: ID does not exist" containerID="673af5193713526abe23d3ad06717349574812d657dea12723d9f30f91f6c7a3"
Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.151776 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"673af5193713526abe23d3ad06717349574812d657dea12723d9f30f91f6c7a3"} err="failed to get container status \"673af5193713526abe23d3ad06717349574812d657dea12723d9f30f91f6c7a3\": rpc error: code = NotFound desc = could not find container \"673af5193713526abe23d3ad06717349574812d657dea12723d9f30f91f6c7a3\": container with ID starting with 673af5193713526abe23d3ad06717349574812d657dea12723d9f30f91f6c7a3 not found: ID does not exist"
Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.268734 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jkb99/must-gather-4lckb"]
Mar 13 12:41:54 crc kubenswrapper[4837]: I0313 12:41:54.065685 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jkb99/must-gather-4lckb" event={"ID":"8822de14-eaa5-4016-91fd-611718d9b51a","Type":"ContainerStarted","Data":"fad50242e977d7ae08cb6453193c2359ef62abea67978e1d59225423ce6fb7f7"}
Mar 13 12:41:55 crc kubenswrapper[4837]: I0313 12:41:55.066916 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7db90d1b-c7eb-4de2-8783-417fd25bdc6f" path="/var/lib/kubelet/pods/7db90d1b-c7eb-4de2-8783-417fd25bdc6f/volumes"
Mar 13 12:41:56 crc kubenswrapper[4837]: I0313 12:41:56.697446 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wdpnx"]
Mar 13 12:41:56 crc kubenswrapper[4837]: I0313 12:41:56.697696 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wdpnx" podUID="75dcf43e-e9a3-4956-9582-9663efc7a07b" containerName="registry-server" containerID="cri-o://491e172a4fdd94d834f91503b7344d1840993d62c62190274ec8d7067e9948b7" gracePeriod=2
Mar 13 12:41:57 crc kubenswrapper[4837]: I0313 12:41:57.112352 4837 generic.go:334] "Generic (PLEG): container finished" podID="75dcf43e-e9a3-4956-9582-9663efc7a07b" containerID="491e172a4fdd94d834f91503b7344d1840993d62c62190274ec8d7067e9948b7" exitCode=0
Mar 13 12:41:57 crc kubenswrapper[4837]: I0313 12:41:57.112415 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdpnx" event={"ID":"75dcf43e-e9a3-4956-9582-9663efc7a07b","Type":"ContainerDied","Data":"491e172a4fdd94d834f91503b7344d1840993d62c62190274ec8d7067e9948b7"}
Mar 13 12:41:59 crc kubenswrapper[4837]: I0313 12:41:59.069135 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65"
Mar 13 12:41:59 crc kubenswrapper[4837]: E0313 12:41:59.070086 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8"
Mar 13 12:41:59 crc kubenswrapper[4837]: I0313 12:41:59.740663 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wdpnx"
Mar 13 12:41:59 crc kubenswrapper[4837]: I0313 12:41:59.855507 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75dcf43e-e9a3-4956-9582-9663efc7a07b-catalog-content\") pod \"75dcf43e-e9a3-4956-9582-9663efc7a07b\" (UID: \"75dcf43e-e9a3-4956-9582-9663efc7a07b\") "
Mar 13 12:41:59 crc kubenswrapper[4837]: I0313 12:41:59.855792 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cmm5\" (UniqueName: \"kubernetes.io/projected/75dcf43e-e9a3-4956-9582-9663efc7a07b-kube-api-access-8cmm5\") pod \"75dcf43e-e9a3-4956-9582-9663efc7a07b\" (UID: \"75dcf43e-e9a3-4956-9582-9663efc7a07b\") "
Mar 13 12:41:59 crc kubenswrapper[4837]: I0313 12:41:59.855814 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75dcf43e-e9a3-4956-9582-9663efc7a07b-utilities\") pod \"75dcf43e-e9a3-4956-9582-9663efc7a07b\" (UID: \"75dcf43e-e9a3-4956-9582-9663efc7a07b\") "
Mar 13 12:41:59 crc kubenswrapper[4837]: I0313 12:41:59.857686 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75dcf43e-e9a3-4956-9582-9663efc7a07b-utilities" (OuterVolumeSpecName: "utilities") pod "75dcf43e-e9a3-4956-9582-9663efc7a07b" (UID: "75dcf43e-e9a3-4956-9582-9663efc7a07b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:41:59 crc kubenswrapper[4837]: I0313 12:41:59.861408 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75dcf43e-e9a3-4956-9582-9663efc7a07b-kube-api-access-8cmm5" (OuterVolumeSpecName: "kube-api-access-8cmm5") pod "75dcf43e-e9a3-4956-9582-9663efc7a07b" (UID: "75dcf43e-e9a3-4956-9582-9663efc7a07b"). InnerVolumeSpecName "kube-api-access-8cmm5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:41:59 crc kubenswrapper[4837]: I0313 12:41:59.906175 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75dcf43e-e9a3-4956-9582-9663efc7a07b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75dcf43e-e9a3-4956-9582-9663efc7a07b" (UID: "75dcf43e-e9a3-4956-9582-9663efc7a07b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:41:59 crc kubenswrapper[4837]: I0313 12:41:59.958057 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cmm5\" (UniqueName: \"kubernetes.io/projected/75dcf43e-e9a3-4956-9582-9663efc7a07b-kube-api-access-8cmm5\") on node \"crc\" DevicePath \"\""
Mar 13 12:41:59 crc kubenswrapper[4837]: I0313 12:41:59.958102 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75dcf43e-e9a3-4956-9582-9663efc7a07b-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 12:41:59 crc kubenswrapper[4837]: I0313 12:41:59.958117 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75dcf43e-e9a3-4956-9582-9663efc7a07b-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.157762 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdpnx" event={"ID":"75dcf43e-e9a3-4956-9582-9663efc7a07b","Type":"ContainerDied","Data":"e5e7a4dd77c0df97d0b938b4b0928d9a15ca84c095db93e2aee641e194eec7e0"}
Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.158088 4837 scope.go:117] "RemoveContainer" containerID="491e172a4fdd94d834f91503b7344d1840993d62c62190274ec8d7067e9948b7"
Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.157811 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wdpnx"
Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.164027 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556762-g52qb"]
Mar 13 12:42:00 crc kubenswrapper[4837]: E0313 12:42:00.164620 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75dcf43e-e9a3-4956-9582-9663efc7a07b" containerName="registry-server"
Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.164663 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="75dcf43e-e9a3-4956-9582-9663efc7a07b" containerName="registry-server"
Mar 13 12:42:00 crc kubenswrapper[4837]: E0313 12:42:00.164676 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75dcf43e-e9a3-4956-9582-9663efc7a07b" containerName="extract-content"
Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.164682 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="75dcf43e-e9a3-4956-9582-9663efc7a07b" containerName="extract-content"
Mar 13 12:42:00 crc kubenswrapper[4837]: E0313 12:42:00.164693 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7db90d1b-c7eb-4de2-8783-417fd25bdc6f" containerName="extract-utilities"
Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.164700 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db90d1b-c7eb-4de2-8783-417fd25bdc6f" containerName="extract-utilities"
Mar 13 12:42:00 crc kubenswrapper[4837]: E0313 12:42:00.164712 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75dcf43e-e9a3-4956-9582-9663efc7a07b" containerName="extract-utilities"
Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.164742 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="75dcf43e-e9a3-4956-9582-9663efc7a07b" containerName="extract-utilities"
Mar 13 12:42:00 crc kubenswrapper[4837]: E0313 12:42:00.164762 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7db90d1b-c7eb-4de2-8783-417fd25bdc6f" containerName="registry-server"
Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.164768 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db90d1b-c7eb-4de2-8783-417fd25bdc6f" containerName="registry-server"
Mar 13 12:42:00 crc kubenswrapper[4837]: E0313 12:42:00.164780 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7db90d1b-c7eb-4de2-8783-417fd25bdc6f" containerName="extract-content"
Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.164785 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db90d1b-c7eb-4de2-8783-417fd25bdc6f" containerName="extract-content"
Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.165019 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="75dcf43e-e9a3-4956-9582-9663efc7a07b" containerName="registry-server"
Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.165035 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="7db90d1b-c7eb-4de2-8783-417fd25bdc6f" containerName="registry-server"
Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.165963 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556762-g52qb"
Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.175722 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556762-g52qb"]
Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.198631 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj"
Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.198760 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.199550 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.203231 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jkb99/must-gather-4lckb" event={"ID":"8822de14-eaa5-4016-91fd-611718d9b51a","Type":"ContainerStarted","Data":"46bee07e0cc64861f34813174541cff76485e4b8cd9b5fb84ab93fd9eff59fed"}
Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.233436 4837 scope.go:117] "RemoveContainer" containerID="6f8327024dd21bd081ba023805b2e005959158663382bf7c861522d50eaf9255"
Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.238992 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wdpnx"]
Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.262724 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wdpnx"]
Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.263738 4837 scope.go:117] "RemoveContainer" containerID="1cdd21f387fca0b63738dbf1a676060e6b1a2034a411336f47b42f6f70b348d1"
Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.264836 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpj48\"
(UniqueName: \"kubernetes.io/projected/e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4-kube-api-access-tpj48\") pod \"auto-csr-approver-29556762-g52qb\" (UID: \"e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4\") " pod="openshift-infra/auto-csr-approver-29556762-g52qb" Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.367190 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpj48\" (UniqueName: \"kubernetes.io/projected/e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4-kube-api-access-tpj48\") pod \"auto-csr-approver-29556762-g52qb\" (UID: \"e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4\") " pod="openshift-infra/auto-csr-approver-29556762-g52qb" Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.383454 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpj48\" (UniqueName: \"kubernetes.io/projected/e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4-kube-api-access-tpj48\") pod \"auto-csr-approver-29556762-g52qb\" (UID: \"e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4\") " pod="openshift-infra/auto-csr-approver-29556762-g52qb" Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.522080 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556762-g52qb" Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.972745 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556762-g52qb"] Mar 13 12:42:01 crc kubenswrapper[4837]: I0313 12:42:01.058510 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75dcf43e-e9a3-4956-9582-9663efc7a07b" path="/var/lib/kubelet/pods/75dcf43e-e9a3-4956-9582-9663efc7a07b/volumes" Mar 13 12:42:01 crc kubenswrapper[4837]: I0313 12:42:01.212661 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556762-g52qb" event={"ID":"e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4","Type":"ContainerStarted","Data":"f39f1cc1116054070489cd8e691ba3a60eebf43d9d1a29eb7b4eab84d3b1b1f3"} Mar 13 12:42:01 crc kubenswrapper[4837]: I0313 12:42:01.214414 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jkb99/must-gather-4lckb" event={"ID":"8822de14-eaa5-4016-91fd-611718d9b51a","Type":"ContainerStarted","Data":"d7374ab200a788a99f53fe2448f4035d1be2d4984c27b0031e0578210408765b"} Mar 13 12:42:01 crc kubenswrapper[4837]: I0313 12:42:01.234808 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jkb99/must-gather-4lckb" podStartSLOduration=2.788961667 podStartE2EDuration="9.234791402s" podCreationTimestamp="2026-03-13 12:41:52 +0000 UTC" firstStartedPulling="2026-03-13 12:41:53.277113075 +0000 UTC m=+3228.915379848" lastFinishedPulling="2026-03-13 12:41:59.72294282 +0000 UTC m=+3235.361209583" observedRunningTime="2026-03-13 12:42:01.228823223 +0000 UTC m=+3236.867089986" watchObservedRunningTime="2026-03-13 12:42:01.234791402 +0000 UTC m=+3236.873058165" Mar 13 12:42:01 crc kubenswrapper[4837]: I0313 12:42:01.556355 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qxmkr" podUID="7daa3751-d057-474f-9a0f-79fdada329a2" 
containerName="registry-server" probeResult="failure" output=< Mar 13 12:42:01 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Mar 13 12:42:01 crc kubenswrapper[4837]: > Mar 13 12:42:02 crc kubenswrapper[4837]: I0313 12:42:02.243064 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556762-g52qb" event={"ID":"e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4","Type":"ContainerStarted","Data":"d6ca53672f75fdcf8f31c32bb76f3e903dae1282d3f22ff4ff5cc9e6da3282e1"} Mar 13 12:42:02 crc kubenswrapper[4837]: I0313 12:42:02.262386 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556762-g52qb" podStartSLOduration=1.457253045 podStartE2EDuration="2.262366872s" podCreationTimestamp="2026-03-13 12:42:00 +0000 UTC" firstStartedPulling="2026-03-13 12:42:00.983808746 +0000 UTC m=+3236.622075549" lastFinishedPulling="2026-03-13 12:42:01.788922603 +0000 UTC m=+3237.427189376" observedRunningTime="2026-03-13 12:42:02.256726184 +0000 UTC m=+3237.894992947" watchObservedRunningTime="2026-03-13 12:42:02.262366872 +0000 UTC m=+3237.900633635" Mar 13 12:42:02 crc kubenswrapper[4837]: E0313 12:42:02.710706 4837 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.138:40232->38.102.83.138:43005: read tcp 38.102.83.138:40232->38.102.83.138:43005: read: connection reset by peer Mar 13 12:42:03 crc kubenswrapper[4837]: I0313 12:42:03.255982 4837 generic.go:334] "Generic (PLEG): container finished" podID="e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4" containerID="d6ca53672f75fdcf8f31c32bb76f3e903dae1282d3f22ff4ff5cc9e6da3282e1" exitCode=0 Mar 13 12:42:03 crc kubenswrapper[4837]: I0313 12:42:03.256044 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556762-g52qb" 
event={"ID":"e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4","Type":"ContainerDied","Data":"d6ca53672f75fdcf8f31c32bb76f3e903dae1282d3f22ff4ff5cc9e6da3282e1"} Mar 13 12:42:03 crc kubenswrapper[4837]: I0313 12:42:03.838332 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jkb99/crc-debug-wvz2q"] Mar 13 12:42:03 crc kubenswrapper[4837]: I0313 12:42:03.840171 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jkb99/crc-debug-wvz2q" Mar 13 12:42:03 crc kubenswrapper[4837]: I0313 12:42:03.937271 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsfj2\" (UniqueName: \"kubernetes.io/projected/50b16782-2e4c-48fd-ba3e-f0557fdbaae8-kube-api-access-nsfj2\") pod \"crc-debug-wvz2q\" (UID: \"50b16782-2e4c-48fd-ba3e-f0557fdbaae8\") " pod="openshift-must-gather-jkb99/crc-debug-wvz2q" Mar 13 12:42:03 crc kubenswrapper[4837]: I0313 12:42:03.937530 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50b16782-2e4c-48fd-ba3e-f0557fdbaae8-host\") pod \"crc-debug-wvz2q\" (UID: \"50b16782-2e4c-48fd-ba3e-f0557fdbaae8\") " pod="openshift-must-gather-jkb99/crc-debug-wvz2q" Mar 13 12:42:04 crc kubenswrapper[4837]: I0313 12:42:04.039695 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsfj2\" (UniqueName: \"kubernetes.io/projected/50b16782-2e4c-48fd-ba3e-f0557fdbaae8-kube-api-access-nsfj2\") pod \"crc-debug-wvz2q\" (UID: \"50b16782-2e4c-48fd-ba3e-f0557fdbaae8\") " pod="openshift-must-gather-jkb99/crc-debug-wvz2q" Mar 13 12:42:04 crc kubenswrapper[4837]: I0313 12:42:04.039755 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50b16782-2e4c-48fd-ba3e-f0557fdbaae8-host\") pod \"crc-debug-wvz2q\" (UID: 
\"50b16782-2e4c-48fd-ba3e-f0557fdbaae8\") " pod="openshift-must-gather-jkb99/crc-debug-wvz2q" Mar 13 12:42:04 crc kubenswrapper[4837]: I0313 12:42:04.039936 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50b16782-2e4c-48fd-ba3e-f0557fdbaae8-host\") pod \"crc-debug-wvz2q\" (UID: \"50b16782-2e4c-48fd-ba3e-f0557fdbaae8\") " pod="openshift-must-gather-jkb99/crc-debug-wvz2q" Mar 13 12:42:04 crc kubenswrapper[4837]: I0313 12:42:04.058420 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsfj2\" (UniqueName: \"kubernetes.io/projected/50b16782-2e4c-48fd-ba3e-f0557fdbaae8-kube-api-access-nsfj2\") pod \"crc-debug-wvz2q\" (UID: \"50b16782-2e4c-48fd-ba3e-f0557fdbaae8\") " pod="openshift-must-gather-jkb99/crc-debug-wvz2q" Mar 13 12:42:04 crc kubenswrapper[4837]: I0313 12:42:04.158721 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jkb99/crc-debug-wvz2q" Mar 13 12:42:04 crc kubenswrapper[4837]: W0313 12:42:04.192230 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50b16782_2e4c_48fd_ba3e_f0557fdbaae8.slice/crio-e787c8c6bb3dbe5c16cf7bc30198ada072e5d26721721acf3e78813724296110 WatchSource:0}: Error finding container e787c8c6bb3dbe5c16cf7bc30198ada072e5d26721721acf3e78813724296110: Status 404 returned error can't find the container with id e787c8c6bb3dbe5c16cf7bc30198ada072e5d26721721acf3e78813724296110 Mar 13 12:42:04 crc kubenswrapper[4837]: I0313 12:42:04.266273 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jkb99/crc-debug-wvz2q" event={"ID":"50b16782-2e4c-48fd-ba3e-f0557fdbaae8","Type":"ContainerStarted","Data":"e787c8c6bb3dbe5c16cf7bc30198ada072e5d26721721acf3e78813724296110"} Mar 13 12:42:04 crc kubenswrapper[4837]: I0313 12:42:04.655206 4837 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556762-g52qb" Mar 13 12:42:04 crc kubenswrapper[4837]: I0313 12:42:04.751805 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpj48\" (UniqueName: \"kubernetes.io/projected/e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4-kube-api-access-tpj48\") pod \"e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4\" (UID: \"e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4\") " Mar 13 12:42:04 crc kubenswrapper[4837]: I0313 12:42:04.758582 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4-kube-api-access-tpj48" (OuterVolumeSpecName: "kube-api-access-tpj48") pod "e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4" (UID: "e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4"). InnerVolumeSpecName "kube-api-access-tpj48". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:42:04 crc kubenswrapper[4837]: I0313 12:42:04.854630 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpj48\" (UniqueName: \"kubernetes.io/projected/e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4-kube-api-access-tpj48\") on node \"crc\" DevicePath \"\"" Mar 13 12:42:05 crc kubenswrapper[4837]: I0313 12:42:05.278167 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556762-g52qb" event={"ID":"e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4","Type":"ContainerDied","Data":"f39f1cc1116054070489cd8e691ba3a60eebf43d9d1a29eb7b4eab84d3b1b1f3"} Mar 13 12:42:05 crc kubenswrapper[4837]: I0313 12:42:05.278205 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f39f1cc1116054070489cd8e691ba3a60eebf43d9d1a29eb7b4eab84d3b1b1f3" Mar 13 12:42:05 crc kubenswrapper[4837]: I0313 12:42:05.278221 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556762-g52qb" Mar 13 12:42:05 crc kubenswrapper[4837]: I0313 12:42:05.328527 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556756-xz7n5"] Mar 13 12:42:05 crc kubenswrapper[4837]: I0313 12:42:05.341751 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556756-xz7n5"] Mar 13 12:42:07 crc kubenswrapper[4837]: I0313 12:42:07.066247 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8833ed1c-80bb-4529-9f4a-6109d1a39f13" path="/var/lib/kubelet/pods/8833ed1c-80bb-4529-9f4a-6109d1a39f13/volumes" Mar 13 12:42:11 crc kubenswrapper[4837]: I0313 12:42:11.543810 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qxmkr" podUID="7daa3751-d057-474f-9a0f-79fdada329a2" containerName="registry-server" probeResult="failure" output=< Mar 13 12:42:11 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Mar 13 12:42:11 crc kubenswrapper[4837]: > Mar 13 12:42:12 crc kubenswrapper[4837]: I0313 12:42:12.049706 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:42:12 crc kubenswrapper[4837]: E0313 12:42:12.050017 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:42:16 crc kubenswrapper[4837]: I0313 12:42:16.386390 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jkb99/crc-debug-wvz2q" 
event={"ID":"50b16782-2e4c-48fd-ba3e-f0557fdbaae8","Type":"ContainerStarted","Data":"5a28fcc3eaaccd2193eb19c5505852014f77b072025f81e4dfef900c7784ce98"} Mar 13 12:42:16 crc kubenswrapper[4837]: I0313 12:42:16.403946 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jkb99/crc-debug-wvz2q" podStartSLOduration=1.8321656339999999 podStartE2EDuration="13.403925032s" podCreationTimestamp="2026-03-13 12:42:03 +0000 UTC" firstStartedPulling="2026-03-13 12:42:04.195937158 +0000 UTC m=+3239.834203921" lastFinishedPulling="2026-03-13 12:42:15.767696556 +0000 UTC m=+3251.405963319" observedRunningTime="2026-03-13 12:42:16.396659942 +0000 UTC m=+3252.034926715" watchObservedRunningTime="2026-03-13 12:42:16.403925032 +0000 UTC m=+3252.042191795" Mar 13 12:42:20 crc kubenswrapper[4837]: I0313 12:42:20.543161 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:42:20 crc kubenswrapper[4837]: I0313 12:42:20.591869 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:42:20 crc kubenswrapper[4837]: I0313 12:42:20.783136 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qxmkr"] Mar 13 12:42:22 crc kubenswrapper[4837]: I0313 12:42:22.430799 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qxmkr" podUID="7daa3751-d057-474f-9a0f-79fdada329a2" containerName="registry-server" containerID="cri-o://d7c4eda8f971bf0f17ec252261cf6e6a9dc759fc4a7ea3f6c6a13900e540a85f" gracePeriod=2 Mar 13 12:42:22 crc kubenswrapper[4837]: I0313 12:42:22.914981 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.019412 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7daa3751-d057-474f-9a0f-79fdada329a2-utilities\") pod \"7daa3751-d057-474f-9a0f-79fdada329a2\" (UID: \"7daa3751-d057-474f-9a0f-79fdada329a2\") " Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.019889 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7daa3751-d057-474f-9a0f-79fdada329a2-catalog-content\") pod \"7daa3751-d057-474f-9a0f-79fdada329a2\" (UID: \"7daa3751-d057-474f-9a0f-79fdada329a2\") " Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.019954 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz5c2\" (UniqueName: \"kubernetes.io/projected/7daa3751-d057-474f-9a0f-79fdada329a2-kube-api-access-mz5c2\") pod \"7daa3751-d057-474f-9a0f-79fdada329a2\" (UID: \"7daa3751-d057-474f-9a0f-79fdada329a2\") " Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.020509 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7daa3751-d057-474f-9a0f-79fdada329a2-utilities" (OuterVolumeSpecName: "utilities") pod "7daa3751-d057-474f-9a0f-79fdada329a2" (UID: "7daa3751-d057-474f-9a0f-79fdada329a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.038370 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7daa3751-d057-474f-9a0f-79fdada329a2-kube-api-access-mz5c2" (OuterVolumeSpecName: "kube-api-access-mz5c2") pod "7daa3751-d057-474f-9a0f-79fdada329a2" (UID: "7daa3751-d057-474f-9a0f-79fdada329a2"). InnerVolumeSpecName "kube-api-access-mz5c2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.122764 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7daa3751-d057-474f-9a0f-79fdada329a2-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.122992 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz5c2\" (UniqueName: \"kubernetes.io/projected/7daa3751-d057-474f-9a0f-79fdada329a2-kube-api-access-mz5c2\") on node \"crc\" DevicePath \"\"" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.140542 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7daa3751-d057-474f-9a0f-79fdada329a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7daa3751-d057-474f-9a0f-79fdada329a2" (UID: "7daa3751-d057-474f-9a0f-79fdada329a2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.224384 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7daa3751-d057-474f-9a0f-79fdada329a2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.440415 4837 generic.go:334] "Generic (PLEG): container finished" podID="7daa3751-d057-474f-9a0f-79fdada329a2" containerID="d7c4eda8f971bf0f17ec252261cf6e6a9dc759fc4a7ea3f6c6a13900e540a85f" exitCode=0 Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.440459 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxmkr" event={"ID":"7daa3751-d057-474f-9a0f-79fdada329a2","Type":"ContainerDied","Data":"d7c4eda8f971bf0f17ec252261cf6e6a9dc759fc4a7ea3f6c6a13900e540a85f"} Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.440509 4837 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-qxmkr" event={"ID":"7daa3751-d057-474f-9a0f-79fdada329a2","Type":"ContainerDied","Data":"d198cd5e4de064380ab84dfd25c9bf6271c0c2b325c9d7fcdbcadff172704d51"} Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.440515 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.440527 4837 scope.go:117] "RemoveContainer" containerID="d7c4eda8f971bf0f17ec252261cf6e6a9dc759fc4a7ea3f6c6a13900e540a85f" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.459989 4837 scope.go:117] "RemoveContainer" containerID="75c1c0316d400d5f2fc848d84d2ba2bd6d6be3266e3bc1c32dfc91a9178306cc" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.485700 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qxmkr"] Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.488776 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qxmkr"] Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.489041 4837 scope.go:117] "RemoveContainer" containerID="1b2c7f9c7376794e5560f9e528d8a765c014e6561edefe605eb3c2dd6333a2a0" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.534832 4837 scope.go:117] "RemoveContainer" containerID="d7c4eda8f971bf0f17ec252261cf6e6a9dc759fc4a7ea3f6c6a13900e540a85f" Mar 13 12:42:23 crc kubenswrapper[4837]: E0313 12:42:23.535182 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7c4eda8f971bf0f17ec252261cf6e6a9dc759fc4a7ea3f6c6a13900e540a85f\": container with ID starting with d7c4eda8f971bf0f17ec252261cf6e6a9dc759fc4a7ea3f6c6a13900e540a85f not found: ID does not exist" containerID="d7c4eda8f971bf0f17ec252261cf6e6a9dc759fc4a7ea3f6c6a13900e540a85f" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.535212 4837 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7c4eda8f971bf0f17ec252261cf6e6a9dc759fc4a7ea3f6c6a13900e540a85f"} err="failed to get container status \"d7c4eda8f971bf0f17ec252261cf6e6a9dc759fc4a7ea3f6c6a13900e540a85f\": rpc error: code = NotFound desc = could not find container \"d7c4eda8f971bf0f17ec252261cf6e6a9dc759fc4a7ea3f6c6a13900e540a85f\": container with ID starting with d7c4eda8f971bf0f17ec252261cf6e6a9dc759fc4a7ea3f6c6a13900e540a85f not found: ID does not exist" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.535231 4837 scope.go:117] "RemoveContainer" containerID="75c1c0316d400d5f2fc848d84d2ba2bd6d6be3266e3bc1c32dfc91a9178306cc" Mar 13 12:42:23 crc kubenswrapper[4837]: E0313 12:42:23.535487 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75c1c0316d400d5f2fc848d84d2ba2bd6d6be3266e3bc1c32dfc91a9178306cc\": container with ID starting with 75c1c0316d400d5f2fc848d84d2ba2bd6d6be3266e3bc1c32dfc91a9178306cc not found: ID does not exist" containerID="75c1c0316d400d5f2fc848d84d2ba2bd6d6be3266e3bc1c32dfc91a9178306cc" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.535514 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75c1c0316d400d5f2fc848d84d2ba2bd6d6be3266e3bc1c32dfc91a9178306cc"} err="failed to get container status \"75c1c0316d400d5f2fc848d84d2ba2bd6d6be3266e3bc1c32dfc91a9178306cc\": rpc error: code = NotFound desc = could not find container \"75c1c0316d400d5f2fc848d84d2ba2bd6d6be3266e3bc1c32dfc91a9178306cc\": container with ID starting with 75c1c0316d400d5f2fc848d84d2ba2bd6d6be3266e3bc1c32dfc91a9178306cc not found: ID does not exist" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.535527 4837 scope.go:117] "RemoveContainer" containerID="1b2c7f9c7376794e5560f9e528d8a765c014e6561edefe605eb3c2dd6333a2a0" Mar 13 12:42:23 crc kubenswrapper[4837]: E0313 
12:42:23.535766 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b2c7f9c7376794e5560f9e528d8a765c014e6561edefe605eb3c2dd6333a2a0\": container with ID starting with 1b2c7f9c7376794e5560f9e528d8a765c014e6561edefe605eb3c2dd6333a2a0 not found: ID does not exist" containerID="1b2c7f9c7376794e5560f9e528d8a765c014e6561edefe605eb3c2dd6333a2a0" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.535796 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b2c7f9c7376794e5560f9e528d8a765c014e6561edefe605eb3c2dd6333a2a0"} err="failed to get container status \"1b2c7f9c7376794e5560f9e528d8a765c014e6561edefe605eb3c2dd6333a2a0\": rpc error: code = NotFound desc = could not find container \"1b2c7f9c7376794e5560f9e528d8a765c014e6561edefe605eb3c2dd6333a2a0\": container with ID starting with 1b2c7f9c7376794e5560f9e528d8a765c014e6561edefe605eb3c2dd6333a2a0 not found: ID does not exist" Mar 13 12:42:25 crc kubenswrapper[4837]: I0313 12:42:25.062451 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7daa3751-d057-474f-9a0f-79fdada329a2" path="/var/lib/kubelet/pods/7daa3751-d057-474f-9a0f-79fdada329a2/volumes" Mar 13 12:42:27 crc kubenswrapper[4837]: I0313 12:42:27.048346 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:42:27 crc kubenswrapper[4837]: E0313 12:42:27.049305 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:42:31 crc kubenswrapper[4837]: I0313 12:42:31.392334 
4837 scope.go:117] "RemoveContainer" containerID="8991bbc909e2098b2d6fb047c31dca6c613e8c861798107378538f426d77e480" Mar 13 12:42:38 crc kubenswrapper[4837]: I0313 12:42:38.048951 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:42:38 crc kubenswrapper[4837]: I0313 12:42:38.575989 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"42895e7497f11e52c6189b0227f4673591fee559ac68adfdd28355562f8112bd"} Mar 13 12:42:57 crc kubenswrapper[4837]: I0313 12:42:57.767249 4837 generic.go:334] "Generic (PLEG): container finished" podID="50b16782-2e4c-48fd-ba3e-f0557fdbaae8" containerID="5a28fcc3eaaccd2193eb19c5505852014f77b072025f81e4dfef900c7784ce98" exitCode=0 Mar 13 12:42:57 crc kubenswrapper[4837]: I0313 12:42:57.767314 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jkb99/crc-debug-wvz2q" event={"ID":"50b16782-2e4c-48fd-ba3e-f0557fdbaae8","Type":"ContainerDied","Data":"5a28fcc3eaaccd2193eb19c5505852014f77b072025f81e4dfef900c7784ce98"} Mar 13 12:42:58 crc kubenswrapper[4837]: I0313 12:42:58.868631 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jkb99/crc-debug-wvz2q" Mar 13 12:42:58 crc kubenswrapper[4837]: I0313 12:42:58.897392 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jkb99/crc-debug-wvz2q"] Mar 13 12:42:58 crc kubenswrapper[4837]: I0313 12:42:58.904939 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jkb99/crc-debug-wvz2q"] Mar 13 12:42:59 crc kubenswrapper[4837]: I0313 12:42:59.048068 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsfj2\" (UniqueName: \"kubernetes.io/projected/50b16782-2e4c-48fd-ba3e-f0557fdbaae8-kube-api-access-nsfj2\") pod \"50b16782-2e4c-48fd-ba3e-f0557fdbaae8\" (UID: \"50b16782-2e4c-48fd-ba3e-f0557fdbaae8\") " Mar 13 12:42:59 crc kubenswrapper[4837]: I0313 12:42:59.049069 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50b16782-2e4c-48fd-ba3e-f0557fdbaae8-host\") pod \"50b16782-2e4c-48fd-ba3e-f0557fdbaae8\" (UID: \"50b16782-2e4c-48fd-ba3e-f0557fdbaae8\") " Mar 13 12:42:59 crc kubenswrapper[4837]: I0313 12:42:59.049228 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50b16782-2e4c-48fd-ba3e-f0557fdbaae8-host" (OuterVolumeSpecName: "host") pod "50b16782-2e4c-48fd-ba3e-f0557fdbaae8" (UID: "50b16782-2e4c-48fd-ba3e-f0557fdbaae8"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 12:42:59 crc kubenswrapper[4837]: I0313 12:42:59.050305 4837 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50b16782-2e4c-48fd-ba3e-f0557fdbaae8-host\") on node \"crc\" DevicePath \"\""
Mar 13 12:42:59 crc kubenswrapper[4837]: I0313 12:42:59.059228 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50b16782-2e4c-48fd-ba3e-f0557fdbaae8-kube-api-access-nsfj2" (OuterVolumeSpecName: "kube-api-access-nsfj2") pod "50b16782-2e4c-48fd-ba3e-f0557fdbaae8" (UID: "50b16782-2e4c-48fd-ba3e-f0557fdbaae8"). InnerVolumeSpecName "kube-api-access-nsfj2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:42:59 crc kubenswrapper[4837]: I0313 12:42:59.069722 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50b16782-2e4c-48fd-ba3e-f0557fdbaae8" path="/var/lib/kubelet/pods/50b16782-2e4c-48fd-ba3e-f0557fdbaae8/volumes"
Mar 13 12:42:59 crc kubenswrapper[4837]: I0313 12:42:59.152492 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsfj2\" (UniqueName: \"kubernetes.io/projected/50b16782-2e4c-48fd-ba3e-f0557fdbaae8-kube-api-access-nsfj2\") on node \"crc\" DevicePath \"\""
Mar 13 12:42:59 crc kubenswrapper[4837]: I0313 12:42:59.792940 4837 scope.go:117] "RemoveContainer" containerID="5a28fcc3eaaccd2193eb19c5505852014f77b072025f81e4dfef900c7784ce98"
Mar 13 12:42:59 crc kubenswrapper[4837]: I0313 12:42:59.792976 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jkb99/crc-debug-wvz2q"
Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.129081 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jkb99/crc-debug-qdfmc"]
Mar 13 12:43:00 crc kubenswrapper[4837]: E0313 12:43:00.130000 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7daa3751-d057-474f-9a0f-79fdada329a2" containerName="extract-content"
Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.130022 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7daa3751-d057-474f-9a0f-79fdada329a2" containerName="extract-content"
Mar 13 12:43:00 crc kubenswrapper[4837]: E0313 12:43:00.130059 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7daa3751-d057-474f-9a0f-79fdada329a2" containerName="extract-utilities"
Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.130069 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7daa3751-d057-474f-9a0f-79fdada329a2" containerName="extract-utilities"
Mar 13 12:43:00 crc kubenswrapper[4837]: E0313 12:43:00.130091 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50b16782-2e4c-48fd-ba3e-f0557fdbaae8" containerName="container-00"
Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.130106 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="50b16782-2e4c-48fd-ba3e-f0557fdbaae8" containerName="container-00"
Mar 13 12:43:00 crc kubenswrapper[4837]: E0313 12:43:00.130128 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4" containerName="oc"
Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.130138 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4" containerName="oc"
Mar 13 12:43:00 crc kubenswrapper[4837]: E0313 12:43:00.130165 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7daa3751-d057-474f-9a0f-79fdada329a2" containerName="registry-server"
Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.130174 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7daa3751-d057-474f-9a0f-79fdada329a2" containerName="registry-server"
Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.130467 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="7daa3751-d057-474f-9a0f-79fdada329a2" containerName="registry-server"
Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.130493 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="50b16782-2e4c-48fd-ba3e-f0557fdbaae8" containerName="container-00"
Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.130534 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4" containerName="oc"
Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.131443 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jkb99/crc-debug-qdfmc"
Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.173014 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58h52\" (UniqueName: \"kubernetes.io/projected/8b157bd4-1b09-44fc-ba60-6b9f3e008253-kube-api-access-58h52\") pod \"crc-debug-qdfmc\" (UID: \"8b157bd4-1b09-44fc-ba60-6b9f3e008253\") " pod="openshift-must-gather-jkb99/crc-debug-qdfmc"
Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.173137 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b157bd4-1b09-44fc-ba60-6b9f3e008253-host\") pod \"crc-debug-qdfmc\" (UID: \"8b157bd4-1b09-44fc-ba60-6b9f3e008253\") " pod="openshift-must-gather-jkb99/crc-debug-qdfmc"
Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.274619 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b157bd4-1b09-44fc-ba60-6b9f3e008253-host\") pod \"crc-debug-qdfmc\" (UID: \"8b157bd4-1b09-44fc-ba60-6b9f3e008253\") " pod="openshift-must-gather-jkb99/crc-debug-qdfmc"
Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.274744 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b157bd4-1b09-44fc-ba60-6b9f3e008253-host\") pod \"crc-debug-qdfmc\" (UID: \"8b157bd4-1b09-44fc-ba60-6b9f3e008253\") " pod="openshift-must-gather-jkb99/crc-debug-qdfmc"
Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.274771 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58h52\" (UniqueName: \"kubernetes.io/projected/8b157bd4-1b09-44fc-ba60-6b9f3e008253-kube-api-access-58h52\") pod \"crc-debug-qdfmc\" (UID: \"8b157bd4-1b09-44fc-ba60-6b9f3e008253\") " pod="openshift-must-gather-jkb99/crc-debug-qdfmc"
Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.296655 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58h52\" (UniqueName: \"kubernetes.io/projected/8b157bd4-1b09-44fc-ba60-6b9f3e008253-kube-api-access-58h52\") pod \"crc-debug-qdfmc\" (UID: \"8b157bd4-1b09-44fc-ba60-6b9f3e008253\") " pod="openshift-must-gather-jkb99/crc-debug-qdfmc"
Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.452229 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jkb99/crc-debug-qdfmc"
Mar 13 12:43:00 crc kubenswrapper[4837]: W0313 12:43:00.485369 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b157bd4_1b09_44fc_ba60_6b9f3e008253.slice/crio-bb188d45448e9137888500bafe79ed93c9d6da36a90e2a65f056bd2b9406ffbc WatchSource:0}: Error finding container bb188d45448e9137888500bafe79ed93c9d6da36a90e2a65f056bd2b9406ffbc: Status 404 returned error can't find the container with id bb188d45448e9137888500bafe79ed93c9d6da36a90e2a65f056bd2b9406ffbc
Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.801766 4837 generic.go:334] "Generic (PLEG): container finished" podID="8b157bd4-1b09-44fc-ba60-6b9f3e008253" containerID="021f2f7590a98a1912559c67d885639fef8ea6affc1fcb856c58211036ebcb42" exitCode=0
Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.801927 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jkb99/crc-debug-qdfmc" event={"ID":"8b157bd4-1b09-44fc-ba60-6b9f3e008253","Type":"ContainerDied","Data":"021f2f7590a98a1912559c67d885639fef8ea6affc1fcb856c58211036ebcb42"}
Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.802110 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jkb99/crc-debug-qdfmc" event={"ID":"8b157bd4-1b09-44fc-ba60-6b9f3e008253","Type":"ContainerStarted","Data":"bb188d45448e9137888500bafe79ed93c9d6da36a90e2a65f056bd2b9406ffbc"}
Mar 13 12:43:01 crc kubenswrapper[4837]: I0313 12:43:01.278090 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jkb99/crc-debug-qdfmc"]
Mar 13 12:43:01 crc kubenswrapper[4837]: I0313 12:43:01.288846 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jkb99/crc-debug-qdfmc"]
Mar 13 12:43:01 crc kubenswrapper[4837]: I0313 12:43:01.925920 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jkb99/crc-debug-qdfmc"
Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.011027 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58h52\" (UniqueName: \"kubernetes.io/projected/8b157bd4-1b09-44fc-ba60-6b9f3e008253-kube-api-access-58h52\") pod \"8b157bd4-1b09-44fc-ba60-6b9f3e008253\" (UID: \"8b157bd4-1b09-44fc-ba60-6b9f3e008253\") "
Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.011119 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b157bd4-1b09-44fc-ba60-6b9f3e008253-host\") pod \"8b157bd4-1b09-44fc-ba60-6b9f3e008253\" (UID: \"8b157bd4-1b09-44fc-ba60-6b9f3e008253\") "
Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.011679 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b157bd4-1b09-44fc-ba60-6b9f3e008253-host" (OuterVolumeSpecName: "host") pod "8b157bd4-1b09-44fc-ba60-6b9f3e008253" (UID: "8b157bd4-1b09-44fc-ba60-6b9f3e008253"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.011809 4837 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b157bd4-1b09-44fc-ba60-6b9f3e008253-host\") on node \"crc\" DevicePath \"\""
Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.017327 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b157bd4-1b09-44fc-ba60-6b9f3e008253-kube-api-access-58h52" (OuterVolumeSpecName: "kube-api-access-58h52") pod "8b157bd4-1b09-44fc-ba60-6b9f3e008253" (UID: "8b157bd4-1b09-44fc-ba60-6b9f3e008253"). InnerVolumeSpecName "kube-api-access-58h52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.113803 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58h52\" (UniqueName: \"kubernetes.io/projected/8b157bd4-1b09-44fc-ba60-6b9f3e008253-kube-api-access-58h52\") on node \"crc\" DevicePath \"\""
Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.447851 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jkb99/crc-debug-jtfh6"]
Mar 13 12:43:02 crc kubenswrapper[4837]: E0313 12:43:02.448218 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b157bd4-1b09-44fc-ba60-6b9f3e008253" containerName="container-00"
Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.448235 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b157bd4-1b09-44fc-ba60-6b9f3e008253" containerName="container-00"
Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.448463 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b157bd4-1b09-44fc-ba60-6b9f3e008253" containerName="container-00"
Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.449128 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jkb99/crc-debug-jtfh6"
Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.521195 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3157eabd-f1a5-4ab2-b3ac-aae960131503-host\") pod \"crc-debug-jtfh6\" (UID: \"3157eabd-f1a5-4ab2-b3ac-aae960131503\") " pod="openshift-must-gather-jkb99/crc-debug-jtfh6"
Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.521354 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59rxt\" (UniqueName: \"kubernetes.io/projected/3157eabd-f1a5-4ab2-b3ac-aae960131503-kube-api-access-59rxt\") pod \"crc-debug-jtfh6\" (UID: \"3157eabd-f1a5-4ab2-b3ac-aae960131503\") " pod="openshift-must-gather-jkb99/crc-debug-jtfh6"
Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.622794 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59rxt\" (UniqueName: \"kubernetes.io/projected/3157eabd-f1a5-4ab2-b3ac-aae960131503-kube-api-access-59rxt\") pod \"crc-debug-jtfh6\" (UID: \"3157eabd-f1a5-4ab2-b3ac-aae960131503\") " pod="openshift-must-gather-jkb99/crc-debug-jtfh6"
Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.622912 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3157eabd-f1a5-4ab2-b3ac-aae960131503-host\") pod \"crc-debug-jtfh6\" (UID: \"3157eabd-f1a5-4ab2-b3ac-aae960131503\") " pod="openshift-must-gather-jkb99/crc-debug-jtfh6"
Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.623101 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3157eabd-f1a5-4ab2-b3ac-aae960131503-host\") pod \"crc-debug-jtfh6\" (UID: \"3157eabd-f1a5-4ab2-b3ac-aae960131503\") " pod="openshift-must-gather-jkb99/crc-debug-jtfh6"
Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.641681 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59rxt\" (UniqueName: \"kubernetes.io/projected/3157eabd-f1a5-4ab2-b3ac-aae960131503-kube-api-access-59rxt\") pod \"crc-debug-jtfh6\" (UID: \"3157eabd-f1a5-4ab2-b3ac-aae960131503\") " pod="openshift-must-gather-jkb99/crc-debug-jtfh6"
Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.763948 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jkb99/crc-debug-jtfh6"
Mar 13 12:43:02 crc kubenswrapper[4837]: W0313 12:43:02.793242 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3157eabd_f1a5_4ab2_b3ac_aae960131503.slice/crio-8770535a8d77dab7ba0fe6f49814a1c54fbc6bedeb1cfde58bf1f489718da20c WatchSource:0}: Error finding container 8770535a8d77dab7ba0fe6f49814a1c54fbc6bedeb1cfde58bf1f489718da20c: Status 404 returned error can't find the container with id 8770535a8d77dab7ba0fe6f49814a1c54fbc6bedeb1cfde58bf1f489718da20c
Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.821463 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jkb99/crc-debug-qdfmc"
Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.822931 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb188d45448e9137888500bafe79ed93c9d6da36a90e2a65f056bd2b9406ffbc"
Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.829984 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jkb99/crc-debug-jtfh6" event={"ID":"3157eabd-f1a5-4ab2-b3ac-aae960131503","Type":"ContainerStarted","Data":"8770535a8d77dab7ba0fe6f49814a1c54fbc6bedeb1cfde58bf1f489718da20c"}
Mar 13 12:43:03 crc kubenswrapper[4837]: I0313 12:43:03.069578 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b157bd4-1b09-44fc-ba60-6b9f3e008253" path="/var/lib/kubelet/pods/8b157bd4-1b09-44fc-ba60-6b9f3e008253/volumes"
Mar 13 12:43:03 crc kubenswrapper[4837]: I0313 12:43:03.842331 4837 generic.go:334] "Generic (PLEG): container finished" podID="3157eabd-f1a5-4ab2-b3ac-aae960131503" containerID="344719b3277f9755326094abd259b245489fd00736db03b65759e9e2ad87423a" exitCode=0
Mar 13 12:43:03 crc kubenswrapper[4837]: I0313 12:43:03.842391 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jkb99/crc-debug-jtfh6" event={"ID":"3157eabd-f1a5-4ab2-b3ac-aae960131503","Type":"ContainerDied","Data":"344719b3277f9755326094abd259b245489fd00736db03b65759e9e2ad87423a"}
Mar 13 12:43:03 crc kubenswrapper[4837]: I0313 12:43:03.916997 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jkb99/crc-debug-jtfh6"]
Mar 13 12:43:03 crc kubenswrapper[4837]: I0313 12:43:03.928117 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jkb99/crc-debug-jtfh6"]
Mar 13 12:43:04 crc kubenswrapper[4837]: I0313 12:43:04.970191 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jkb99/crc-debug-jtfh6"
Mar 13 12:43:05 crc kubenswrapper[4837]: I0313 12:43:05.167990 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3157eabd-f1a5-4ab2-b3ac-aae960131503-host\") pod \"3157eabd-f1a5-4ab2-b3ac-aae960131503\" (UID: \"3157eabd-f1a5-4ab2-b3ac-aae960131503\") "
Mar 13 12:43:05 crc kubenswrapper[4837]: I0313 12:43:05.168081 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59rxt\" (UniqueName: \"kubernetes.io/projected/3157eabd-f1a5-4ab2-b3ac-aae960131503-kube-api-access-59rxt\") pod \"3157eabd-f1a5-4ab2-b3ac-aae960131503\" (UID: \"3157eabd-f1a5-4ab2-b3ac-aae960131503\") "
Mar 13 12:43:05 crc kubenswrapper[4837]: I0313 12:43:05.168538 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3157eabd-f1a5-4ab2-b3ac-aae960131503-host" (OuterVolumeSpecName: "host") pod "3157eabd-f1a5-4ab2-b3ac-aae960131503" (UID: "3157eabd-f1a5-4ab2-b3ac-aae960131503"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 12:43:05 crc kubenswrapper[4837]: I0313 12:43:05.168866 4837 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3157eabd-f1a5-4ab2-b3ac-aae960131503-host\") on node \"crc\" DevicePath \"\""
Mar 13 12:43:05 crc kubenswrapper[4837]: I0313 12:43:05.176966 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3157eabd-f1a5-4ab2-b3ac-aae960131503-kube-api-access-59rxt" (OuterVolumeSpecName: "kube-api-access-59rxt") pod "3157eabd-f1a5-4ab2-b3ac-aae960131503" (UID: "3157eabd-f1a5-4ab2-b3ac-aae960131503"). InnerVolumeSpecName "kube-api-access-59rxt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:43:05 crc kubenswrapper[4837]: I0313 12:43:05.274219 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59rxt\" (UniqueName: \"kubernetes.io/projected/3157eabd-f1a5-4ab2-b3ac-aae960131503-kube-api-access-59rxt\") on node \"crc\" DevicePath \"\""
Mar 13 12:43:05 crc kubenswrapper[4837]: I0313 12:43:05.860550 4837 scope.go:117] "RemoveContainer" containerID="344719b3277f9755326094abd259b245489fd00736db03b65759e9e2ad87423a"
Mar 13 12:43:05 crc kubenswrapper[4837]: I0313 12:43:05.860563 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jkb99/crc-debug-jtfh6"
Mar 13 12:43:07 crc kubenswrapper[4837]: I0313 12:43:07.067304 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3157eabd-f1a5-4ab2-b3ac-aae960131503" path="/var/lib/kubelet/pods/3157eabd-f1a5-4ab2-b3ac-aae960131503/volumes"
Mar 13 12:43:19 crc kubenswrapper[4837]: I0313 12:43:19.130367 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6d84f6b8c8-8rrwq_74c7e377-b579-47bc-a992-cca0cf047627/barbican-api/0.log"
Mar 13 12:43:19 crc kubenswrapper[4837]: I0313 12:43:19.315168 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-58c489697d-dgjtz_d1cfe08e-23bd-4f52-ab3c-3d68377de2a9/barbican-keystone-listener/0.log"
Mar 13 12:43:19 crc kubenswrapper[4837]: I0313 12:43:19.329314 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6d84f6b8c8-8rrwq_74c7e377-b579-47bc-a992-cca0cf047627/barbican-api-log/0.log"
Mar 13 12:43:19 crc kubenswrapper[4837]: I0313 12:43:19.390141 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-58c489697d-dgjtz_d1cfe08e-23bd-4f52-ab3c-3d68377de2a9/barbican-keystone-listener-log/0.log"
Mar 13 12:43:19 crc kubenswrapper[4837]: I0313 12:43:19.516218 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6f4ff9ff9-mjmsz_55084c82-a823-4f31-926e-21702ba02ba1/barbican-worker/0.log"
Mar 13 12:43:19 crc kubenswrapper[4837]: I0313 12:43:19.528934 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6f4ff9ff9-mjmsz_55084c82-a823-4f31-926e-21702ba02ba1/barbican-worker-log/0.log"
Mar 13 12:43:19 crc kubenswrapper[4837]: I0313 12:43:19.728175 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj_2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 13 12:43:19 crc kubenswrapper[4837]: I0313 12:43:19.773048 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_82b5b509-a674-4a89-a7cc-c01c7bfca144/ceilometer-central-agent/0.log"
Mar 13 12:43:19 crc kubenswrapper[4837]: I0313 12:43:19.835328 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_82b5b509-a674-4a89-a7cc-c01c7bfca144/ceilometer-notification-agent/0.log"
Mar 13 12:43:19 crc kubenswrapper[4837]: I0313 12:43:19.911966 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_82b5b509-a674-4a89-a7cc-c01c7bfca144/proxy-httpd/0.log"
Mar 13 12:43:19 crc kubenswrapper[4837]: I0313 12:43:19.948303 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_82b5b509-a674-4a89-a7cc-c01c7bfca144/sg-core/0.log"
Mar 13 12:43:20 crc kubenswrapper[4837]: I0313 12:43:20.062154 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a8004928-50bc-4db8-a701-4458c42bc776/cinder-api/0.log"
Mar 13 12:43:20 crc kubenswrapper[4837]: I0313 12:43:20.110544 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a8004928-50bc-4db8-a701-4458c42bc776/cinder-api-log/0.log"
Mar 13 12:43:20 crc kubenswrapper[4837]: I0313 12:43:20.202835 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_580b8861-16eb-4142-bd61-6d0221a07f4d/cinder-scheduler/0.log"
Mar 13 12:43:20 crc kubenswrapper[4837]: I0313 12:43:20.297673 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_580b8861-16eb-4142-bd61-6d0221a07f4d/probe/0.log"
Mar 13 12:43:20 crc kubenswrapper[4837]: I0313 12:43:20.405851 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-s95mk_875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 13 12:43:20 crc kubenswrapper[4837]: I0313 12:43:20.499709 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp_0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 13 12:43:20 crc kubenswrapper[4837]: I0313 12:43:20.593474 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-bxc2t_98f4bdc5-6452-4630-a299-6234d8a63bf8/init/0.log"
Mar 13 12:43:20 crc kubenswrapper[4837]: I0313 12:43:20.742923 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-bxc2t_98f4bdc5-6452-4630-a299-6234d8a63bf8/init/0.log"
Mar 13 12:43:20 crc kubenswrapper[4837]: I0313 12:43:20.801313 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-bxc2t_98f4bdc5-6452-4630-a299-6234d8a63bf8/dnsmasq-dns/0.log"
Mar 13 12:43:20 crc kubenswrapper[4837]: I0313 12:43:20.839553 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts_121f6d1b-1277-4d68-8a48-6c4630dd6fe5/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 13 12:43:20 crc kubenswrapper[4837]: I0313 12:43:20.989548 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d3f87d89-35d5-4dc0-9c37-5297718a9351/glance-httpd/0.log"
Mar 13 12:43:21 crc kubenswrapper[4837]: I0313 12:43:21.008175 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d3f87d89-35d5-4dc0-9c37-5297718a9351/glance-log/0.log"
Mar 13 12:43:21 crc kubenswrapper[4837]: I0313 12:43:21.171288 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d0f3b003-127f-414f-877a-8f7df2872049/glance-log/0.log"
Mar 13 12:43:21 crc kubenswrapper[4837]: I0313 12:43:21.191399 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d0f3b003-127f-414f-877a-8f7df2872049/glance-httpd/0.log"
Mar 13 12:43:21 crc kubenswrapper[4837]: I0313 12:43:21.317859 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-fd6ddfd9b-f66l8_4d3df345-07a2-41bf-aae4-088b3ce83b63/horizon/0.log"
Mar 13 12:43:21 crc kubenswrapper[4837]: I0313 12:43:21.460595 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-gj59c_6cc8d0dd-d1e6-4374-bb90-aaefc9197350/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 13 12:43:21 crc kubenswrapper[4837]: I0313 12:43:21.625761 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-fd6ddfd9b-f66l8_4d3df345-07a2-41bf-aae4-088b3ce83b63/horizon-log/0.log"
Mar 13 12:43:21 crc kubenswrapper[4837]: I0313 12:43:21.743556 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-2b48q_033a02c2-cbe4-4676-ae46-f9b9b17a60fb/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 13 12:43:21 crc kubenswrapper[4837]: I0313 12:43:21.956003 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_abd69ff2-e72e-40c0-925f-d0c1c0a40f9a/kube-state-metrics/0.log"
Mar 13 12:43:22 crc kubenswrapper[4837]: I0313 12:43:22.061100 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-55dc4d44f8-mvjvg_9cb9614d-a433-4be3-8145-4c1c8593404f/keystone-api/0.log"
Mar 13 12:43:22 crc kubenswrapper[4837]: I0313 12:43:22.241680 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5_394104d4-0291-4071-a7da-d7b71e0f4083/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 13 12:43:22 crc kubenswrapper[4837]: I0313 12:43:22.515994 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-667d547b9-4p8qm_3c00dfc0-061b-43ba-b529-a89c9157a0cf/neutron-api/0.log"
Mar 13 12:43:22 crc kubenswrapper[4837]: I0313 12:43:22.584447 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-667d547b9-4p8qm_3c00dfc0-061b-43ba-b529-a89c9157a0cf/neutron-httpd/0.log"
Mar 13 12:43:22 crc kubenswrapper[4837]: I0313 12:43:22.722601 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4_20f35066-9c10-4433-a655-f5cef18d4deb/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 13 12:43:23 crc kubenswrapper[4837]: I0313 12:43:23.252208 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4e6cd1d9-f670-4e94-8322-44e471c3be71/nova-api-log/0.log"
Mar 13 12:43:23 crc kubenswrapper[4837]: I0313 12:43:23.418935 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4e6cd1d9-f670-4e94-8322-44e471c3be71/nova-api-api/0.log"
Mar 13 12:43:23 crc kubenswrapper[4837]: I0313 12:43:23.516453 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_58240a84-c8ab-43a9-8113-eaf2d0ddea2e/nova-cell0-conductor-conductor/0.log"
Mar 13 12:43:23 crc kubenswrapper[4837]: I0313 12:43:23.595570 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_9a51debb-c1cb-4a55-b845-e89d89d11e86/nova-cell1-conductor-conductor/0.log"
Mar 13 12:43:23 crc kubenswrapper[4837]: I0313 12:43:23.846693 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_662e258d-fe94-4373-912d-c906f1e93c90/nova-cell1-novncproxy-novncproxy/0.log"
Mar 13 12:43:23 crc kubenswrapper[4837]: I0313 12:43:23.863815 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-4jdmk_e6986f16-e143-49f4-81e5-58abba717876/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 13 12:43:24 crc kubenswrapper[4837]: I0313 12:43:24.129593 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7faa5418-aa48-4e20-830c-bb171cfea0d9/nova-metadata-log/0.log"
Mar 13 12:43:24 crc kubenswrapper[4837]: I0313 12:43:24.266849 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d380e047-7297-4835-b948-6c86c6b6aa27/nova-scheduler-scheduler/0.log"
Mar 13 12:43:24 crc kubenswrapper[4837]: I0313 12:43:24.338368 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_362e31d4-ea62-40ed-8426-982d47559472/mysql-bootstrap/0.log"
Mar 13 12:43:24 crc kubenswrapper[4837]: I0313 12:43:24.541016 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_362e31d4-ea62-40ed-8426-982d47559472/mysql-bootstrap/0.log"
Mar 13 12:43:24 crc kubenswrapper[4837]: I0313 12:43:24.580618 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_362e31d4-ea62-40ed-8426-982d47559472/galera/0.log"
Mar 13 12:43:24 crc kubenswrapper[4837]: I0313 12:43:24.706349 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd/mysql-bootstrap/0.log"
Mar 13 12:43:25 crc kubenswrapper[4837]: I0313 12:43:25.013667 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd/mysql-bootstrap/0.log"
Mar 13 12:43:25 crc kubenswrapper[4837]: I0313 12:43:25.015392 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd/galera/0.log"
Mar 13 12:43:25 crc kubenswrapper[4837]: I0313 12:43:25.097413 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7faa5418-aa48-4e20-830c-bb171cfea0d9/nova-metadata-metadata/0.log"
Mar 13 12:43:25 crc kubenswrapper[4837]: I0313 12:43:25.183976 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_5d15c820-a2ee-4d4c-986f-2c2f09b43f79/openstackclient/0.log"
Mar 13 12:43:25 crc kubenswrapper[4837]: I0313 12:43:25.237876 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-w69p6_18eb496a-7d9f-4bf6-af71-3b7b585d0f7d/openstack-network-exporter/0.log"
Mar 13 12:43:25 crc kubenswrapper[4837]: I0313 12:43:25.428827 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-nbhpw_32dc51d9-5638-4530-91c8-5be8c13e60f3/ovn-controller/0.log"
Mar 13 12:43:25 crc kubenswrapper[4837]: I0313 12:43:25.519468 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ls998_71e00962-6b2f-495c-8f34-52993f66cef9/ovsdb-server-init/0.log"
Mar 13 12:43:25 crc kubenswrapper[4837]: I0313 12:43:25.666212 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ls998_71e00962-6b2f-495c-8f34-52993f66cef9/ovsdb-server-init/0.log"
Mar 13 12:43:25 crc kubenswrapper[4837]: I0313 12:43:25.733037 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ls998_71e00962-6b2f-495c-8f34-52993f66cef9/ovsdb-server/0.log"
Mar 13 12:43:25 crc kubenswrapper[4837]: I0313 12:43:25.754331 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ls998_71e00962-6b2f-495c-8f34-52993f66cef9/ovs-vswitchd/0.log"
Mar 13 12:43:25 crc kubenswrapper[4837]: I0313 12:43:25.945240 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-kbffp_092bd277-504a-450d-aca1-d8ecc18f0c9f/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 13 12:43:25 crc kubenswrapper[4837]: I0313 12:43:25.956452 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_25ea0f5e-e277-4944-8c9d-2c7709e1a8cf/openstack-network-exporter/0.log"
Mar 13 12:43:26 crc kubenswrapper[4837]: I0313 12:43:26.058249 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_25ea0f5e-e277-4944-8c9d-2c7709e1a8cf/ovn-northd/0.log"
Mar 13 12:43:26 crc kubenswrapper[4837]: I0313 12:43:26.160941 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_38d61ffe-3c44-4657-bc91-d849f766a3e1/openstack-network-exporter/0.log"
Mar 13 12:43:26 crc kubenswrapper[4837]: I0313 12:43:26.214464 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_38d61ffe-3c44-4657-bc91-d849f766a3e1/ovsdbserver-nb/0.log"
Mar 13 12:43:26 crc kubenswrapper[4837]: I0313 12:43:26.426271 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3d10fcb0-4d45-45bf-a663-971b8ce74010/openstack-network-exporter/0.log"
Mar 13 12:43:26 crc kubenswrapper[4837]: I0313 12:43:26.471390 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3d10fcb0-4d45-45bf-a663-971b8ce74010/ovsdbserver-sb/0.log"
Mar 13 12:43:26 crc kubenswrapper[4837]: I0313 12:43:26.612710 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-59f7b5dc8d-rnsz6_07eece9e-0e59-4a06-8fea-efb4217d6907/placement-api/0.log"
Mar 13 12:43:26 crc kubenswrapper[4837]: I0313 12:43:26.708986 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_90028d66-5134-4c09-af15-71e754f49bf3/setup-container/0.log"
Mar 13 12:43:26 crc kubenswrapper[4837]: I0313 12:43:26.734802 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-59f7b5dc8d-rnsz6_07eece9e-0e59-4a06-8fea-efb4217d6907/placement-log/0.log"
Mar 13 12:43:26 crc kubenswrapper[4837]: I0313 12:43:26.957153 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_245e5a26-d143-4e4d-bae8-094275a91574/setup-container/0.log"
Mar 13 12:43:27 crc kubenswrapper[4837]: I0313 12:43:27.009698 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_90028d66-5134-4c09-af15-71e754f49bf3/rabbitmq/0.log"
Mar 13 12:43:27 crc kubenswrapper[4837]: I0313 12:43:27.025473 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_90028d66-5134-4c09-af15-71e754f49bf3/setup-container/0.log"
Mar 13 12:43:27 crc kubenswrapper[4837]: I0313 12:43:27.197033 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_245e5a26-d143-4e4d-bae8-094275a91574/setup-container/0.log"
Mar 13 12:43:27 crc kubenswrapper[4837]: I0313 12:43:27.229068 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_245e5a26-d143-4e4d-bae8-094275a91574/rabbitmq/0.log"
Mar 13 12:43:27 crc kubenswrapper[4837]: I0313 12:43:27.349869 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d_3b96ea7e-2148-4659-9a26-3335c88888c1/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 13 12:43:27 crc kubenswrapper[4837]: I0313 12:43:27.457889 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-pz9nt_0b7402b1-0b76-4ffa-b37f-6e014183f6a6/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 13 12:43:27 crc kubenswrapper[4837]: I0313 12:43:27.571893 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6_bfedd3e5-e8d7-4311-9a0d-30276ce40418/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 13 12:43:27 crc kubenswrapper[4837]: I0313 12:43:27.698573 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-s6jdp_f12ac62a-2011-4e89-a16f-e136959f9d1a/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 13 12:43:27 crc kubenswrapper[4837]: I0313 12:43:27.794756 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-vjnpx_4ddcb794-ab03-4308-a93c-c5929ed96e01/ssh-known-hosts-edpm-deployment/0.log"
Mar 13 12:43:27 crc kubenswrapper[4837]: I0313 12:43:27.971771 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-bfbc874dc-vsh7q_36ffa543-526d-4d56-b599-06fcfe0988cf/proxy-server/0.log"
Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.009909 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-bfbc874dc-vsh7q_36ffa543-526d-4d56-b599-06fcfe0988cf/proxy-httpd/0.log"
Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.170404 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-69xgx_24998567-afa6-4adc-a503-4fc054946aef/swift-ring-rebalance/0.log"
Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.231395 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/account-auditor/0.log"
Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.259331 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/account-reaper/0.log"
Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.362424 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/account-replicator/0.log"
Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.440783 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/account-server/0.log"
Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.450295 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/container-auditor/0.log"
Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.501908 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/container-replicator/0.log"
Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.603290 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/container-server/0.log"
Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.628649 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/container-updater/0.log"
Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.651978 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/object-auditor/0.log"
Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.730213 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/object-expirer/0.log"
Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.849385 4837 log.go:25] "Finished parsing log file"
path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/object-updater/0.log" Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.862564 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/object-replicator/0.log" Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.863297 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/object-server/0.log" Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.997479 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/rsync/0.log" Mar 13 12:43:29 crc kubenswrapper[4837]: I0313 12:43:29.075263 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/swift-recon-cron/0.log" Mar 13 12:43:29 crc kubenswrapper[4837]: I0313 12:43:29.230881 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x_ac15848f-4f6f-4159-828f-d30a77f93a4b/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:43:29 crc kubenswrapper[4837]: I0313 12:43:29.305059 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_66bdda91-c5b6-4879-9adf-21846884c797/tempest-tests-tempest-tests-runner/0.log" Mar 13 12:43:29 crc kubenswrapper[4837]: I0313 12:43:29.490215 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_0244acef-b630-4b97-9bb5-9f99de391613/test-operator-logs-container/0.log" Mar 13 12:43:29 crc kubenswrapper[4837]: I0313 12:43:29.540304 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-42br8_e3ec33da-9091-4eb1-aafa-62b9bdf16072/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:43:39 crc kubenswrapper[4837]: I0313 12:43:39.525118 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_ae39431b-5fa4-4a09-b76f-44b4d256c129/memcached/0.log" Mar 13 12:43:53 crc kubenswrapper[4837]: I0313 12:43:53.608359 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-b7cdx_e645f00a-8463-4fac-b010-f0500b54d68a/manager/0.log" Mar 13 12:43:54 crc kubenswrapper[4837]: I0313 12:43:54.060161 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/util/0.log" Mar 13 12:43:54 crc kubenswrapper[4837]: I0313 12:43:54.279575 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/util/0.log" Mar 13 12:43:54 crc kubenswrapper[4837]: I0313 12:43:54.317497 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/pull/0.log" Mar 13 12:43:54 crc kubenswrapper[4837]: I0313 12:43:54.480526 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/pull/0.log" Mar 13 12:43:54 crc kubenswrapper[4837]: I0313 12:43:54.645233 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/util/0.log" Mar 13 12:43:54 crc kubenswrapper[4837]: I0313 
12:43:54.659101 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/pull/0.log" Mar 13 12:43:54 crc kubenswrapper[4837]: I0313 12:43:54.766431 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-kbn8z_0a24601d-8e41-4f99-9e33-870d791a3e7e/manager/0.log" Mar 13 12:43:54 crc kubenswrapper[4837]: I0313 12:43:54.904043 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/extract/0.log" Mar 13 12:43:55 crc kubenswrapper[4837]: I0313 12:43:55.183534 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-mrgb9_1870e3ae-40fd-479c-9aa7-9ce3a3e2dd2e/manager/0.log" Mar 13 12:43:55 crc kubenswrapper[4837]: I0313 12:43:55.639157 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-ss4rm_b2c881d7-03db-4608-a3f4-9a9ad8b2f5da/manager/0.log" Mar 13 12:43:55 crc kubenswrapper[4837]: I0313 12:43:55.670803 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-bvmr7_11a29883-0638-4da4-a1dc-bf2127a3645c/manager/0.log" Mar 13 12:43:56 crc kubenswrapper[4837]: I0313 12:43:56.036292 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-9zvxf_89e6d6f8-7bd3-4862-b41c-cd5c1f05f3e5/manager/0.log" Mar 13 12:43:56 crc kubenswrapper[4837]: I0313 12:43:56.223142 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-fhlk9_c19c3466-ab50-4be3-8299-d7b8b3d263df/manager/0.log" Mar 13 12:43:56 crc 
kubenswrapper[4837]: I0313 12:43:56.428863 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-kc2x6_9bd066a9-3999-405a-b619-540678a46ded/manager/0.log" Mar 13 12:43:56 crc kubenswrapper[4837]: I0313 12:43:56.474556 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-twrg7_fa1b1ba2-3856-49cb-bda4-8ac5e63b5298/manager/0.log" Mar 13 12:43:56 crc kubenswrapper[4837]: I0313 12:43:56.743231 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-7nm95_046bdee0-f0cf-4d17-916b-68d301502473/manager/0.log" Mar 13 12:43:56 crc kubenswrapper[4837]: I0313 12:43:56.887020 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-6ht9l_3059d7c0-2624-4d3e-af0f-de054401f1ec/manager/0.log" Mar 13 12:43:57 crc kubenswrapper[4837]: I0313 12:43:57.103151 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-569cc54c5-shrx7_ee1c592d-7979-4b75-b8e4-7ccd6d7d6048/manager/0.log" Mar 13 12:43:57 crc kubenswrapper[4837]: I0313 12:43:57.187063 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-7f7zd_561aed86-f289-4dd1-8c53-307ccdc99165/manager/0.log" Mar 13 12:43:57 crc kubenswrapper[4837]: I0313 12:43:57.269738 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b77x9vc_7b38159c-e030-4734-963d-dfc38d29c75c/manager/0.log" Mar 13 12:43:57 crc kubenswrapper[4837]: I0313 12:43:57.651498 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-init-c99df78b8-qxmfb_4f8c5e9e-7680-4bc3-8096-0c62a1de4da5/operator/0.log" Mar 13 12:43:57 crc kubenswrapper[4837]: I0313 12:43:57.841362 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-mdjzs_9da10ec5-aa1b-4797-91ce-04a91266831a/registry-server/0.log" Mar 13 12:43:58 crc kubenswrapper[4837]: I0313 12:43:58.174851 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-nxwr9_5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d/manager/0.log" Mar 13 12:43:58 crc kubenswrapper[4837]: I0313 12:43:58.398881 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-fwblp_35a21ab1-95b5-446a-ae10-d004e5aa2995/manager/0.log" Mar 13 12:43:58 crc kubenswrapper[4837]: I0313 12:43:58.605880 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xkk4z_ce0c89e1-3fc0-473d-875f-461c8b423061/operator/0.log" Mar 13 12:43:58 crc kubenswrapper[4837]: I0313 12:43:58.831659 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-cfv8z_55649f1c-678e-4e03-be55-7c4435446199/manager/0.log" Mar 13 12:43:59 crc kubenswrapper[4837]: I0313 12:43:59.136605 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-8lkmx_cb20db22-bd0e-4897-8ed6-a6a80a91ffff/manager/0.log" Mar 13 12:43:59 crc kubenswrapper[4837]: I0313 12:43:59.206172 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-dk4nr_fe107e39-b5ec-473d-8851-b57775dadafc/manager/0.log" Mar 13 12:43:59 crc kubenswrapper[4837]: I0313 12:43:59.247903 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-55876d85bb-96mp7_eaf3fa29-f441-43df-9fbe-409d9d8ad871/manager/0.log" Mar 13 12:43:59 crc kubenswrapper[4837]: I0313 12:43:59.334302 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-hrcp9_5ef20b1d-5c03-4993-b635-b031ddcab3bf/manager/0.log" Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.032335 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-jvdqq_1d59bb7f-598d-4c70-9b8c-ce4e3048691f/manager/0.log" Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.146251 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556764-r7n49"] Mar 13 12:44:00 crc kubenswrapper[4837]: E0313 12:44:00.150092 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3157eabd-f1a5-4ab2-b3ac-aae960131503" containerName="container-00" Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.150112 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="3157eabd-f1a5-4ab2-b3ac-aae960131503" containerName="container-00" Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.150295 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="3157eabd-f1a5-4ab2-b3ac-aae960131503" containerName="container-00" Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.150960 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556764-r7n49" Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.153607 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.153741 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.153760 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.162324 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556764-r7n49"] Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.300365 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7db7\" (UniqueName: \"kubernetes.io/projected/7e88464a-8619-4750-ac96-b1ad569fcece-kube-api-access-d7db7\") pod \"auto-csr-approver-29556764-r7n49\" (UID: \"7e88464a-8619-4750-ac96-b1ad569fcece\") " pod="openshift-infra/auto-csr-approver-29556764-r7n49" Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.402903 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7db7\" (UniqueName: \"kubernetes.io/projected/7e88464a-8619-4750-ac96-b1ad569fcece-kube-api-access-d7db7\") pod \"auto-csr-approver-29556764-r7n49\" (UID: \"7e88464a-8619-4750-ac96-b1ad569fcece\") " pod="openshift-infra/auto-csr-approver-29556764-r7n49" Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.423899 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7db7\" (UniqueName: \"kubernetes.io/projected/7e88464a-8619-4750-ac96-b1ad569fcece-kube-api-access-d7db7\") pod \"auto-csr-approver-29556764-r7n49\" (UID: \"7e88464a-8619-4750-ac96-b1ad569fcece\") " 
pod="openshift-infra/auto-csr-approver-29556764-r7n49" Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.476596 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556764-r7n49" Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.950785 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556764-r7n49"] Mar 13 12:44:00 crc kubenswrapper[4837]: W0313 12:44:00.952441 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e88464a_8619_4750_ac96_b1ad569fcece.slice/crio-bfb4f6797f10f32b7ae2c1839830c41e8293f02b1b5b7739f3c42113f69cd41e WatchSource:0}: Error finding container bfb4f6797f10f32b7ae2c1839830c41e8293f02b1b5b7739f3c42113f69cd41e: Status 404 returned error can't find the container with id bfb4f6797f10f32b7ae2c1839830c41e8293f02b1b5b7739f3c42113f69cd41e Mar 13 12:44:01 crc kubenswrapper[4837]: I0313 12:44:01.495953 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556764-r7n49" event={"ID":"7e88464a-8619-4750-ac96-b1ad569fcece","Type":"ContainerStarted","Data":"bfb4f6797f10f32b7ae2c1839830c41e8293f02b1b5b7739f3c42113f69cd41e"} Mar 13 12:44:02 crc kubenswrapper[4837]: I0313 12:44:02.504959 4837 generic.go:334] "Generic (PLEG): container finished" podID="7e88464a-8619-4750-ac96-b1ad569fcece" containerID="1dc84242c71f8e5d31bcd05b0ae44aeb29c8a625295bbab7f2eb79c610ba55a4" exitCode=0 Mar 13 12:44:02 crc kubenswrapper[4837]: I0313 12:44:02.505078 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556764-r7n49" event={"ID":"7e88464a-8619-4750-ac96-b1ad569fcece","Type":"ContainerDied","Data":"1dc84242c71f8e5d31bcd05b0ae44aeb29c8a625295bbab7f2eb79c610ba55a4"} Mar 13 12:44:03 crc kubenswrapper[4837]: I0313 12:44:03.918843 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556764-r7n49" Mar 13 12:44:03 crc kubenswrapper[4837]: I0313 12:44:03.971335 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7db7\" (UniqueName: \"kubernetes.io/projected/7e88464a-8619-4750-ac96-b1ad569fcece-kube-api-access-d7db7\") pod \"7e88464a-8619-4750-ac96-b1ad569fcece\" (UID: \"7e88464a-8619-4750-ac96-b1ad569fcece\") " Mar 13 12:44:03 crc kubenswrapper[4837]: I0313 12:44:03.977868 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e88464a-8619-4750-ac96-b1ad569fcece-kube-api-access-d7db7" (OuterVolumeSpecName: "kube-api-access-d7db7") pod "7e88464a-8619-4750-ac96-b1ad569fcece" (UID: "7e88464a-8619-4750-ac96-b1ad569fcece"). InnerVolumeSpecName "kube-api-access-d7db7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:44:04 crc kubenswrapper[4837]: I0313 12:44:04.074192 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7db7\" (UniqueName: \"kubernetes.io/projected/7e88464a-8619-4750-ac96-b1ad569fcece-kube-api-access-d7db7\") on node \"crc\" DevicePath \"\"" Mar 13 12:44:04 crc kubenswrapper[4837]: I0313 12:44:04.522940 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556764-r7n49" event={"ID":"7e88464a-8619-4750-ac96-b1ad569fcece","Type":"ContainerDied","Data":"bfb4f6797f10f32b7ae2c1839830c41e8293f02b1b5b7739f3c42113f69cd41e"} Mar 13 12:44:04 crc kubenswrapper[4837]: I0313 12:44:04.522982 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfb4f6797f10f32b7ae2c1839830c41e8293f02b1b5b7739f3c42113f69cd41e" Mar 13 12:44:04 crc kubenswrapper[4837]: I0313 12:44:04.523010 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556764-r7n49" Mar 13 12:44:04 crc kubenswrapper[4837]: I0313 12:44:04.982727 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556758-srcvt"] Mar 13 12:44:04 crc kubenswrapper[4837]: I0313 12:44:04.990264 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556758-srcvt"] Mar 13 12:44:05 crc kubenswrapper[4837]: I0313 12:44:05.064274 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38" path="/var/lib/kubelet/pods/0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38/volumes" Mar 13 12:44:17 crc kubenswrapper[4837]: I0313 12:44:17.715364 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-jrm5t_00848ba6-522a-45c7-81bd-7ab287d77626/control-plane-machine-set-operator/0.log" Mar 13 12:44:17 crc kubenswrapper[4837]: I0313 12:44:17.835061 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vsp2m_6db10103-96be-4420-b302-a7064e347f61/kube-rbac-proxy/0.log" Mar 13 12:44:17 crc kubenswrapper[4837]: I0313 12:44:17.909835 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vsp2m_6db10103-96be-4420-b302-a7064e347f61/machine-api-operator/0.log" Mar 13 12:44:29 crc kubenswrapper[4837]: I0313 12:44:29.433657 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-dlspp_5ecc1237-3421-41d5-8efb-a62399ae1d73/cert-manager-controller/0.log" Mar 13 12:44:29 crc kubenswrapper[4837]: I0313 12:44:29.636987 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-xzv5h_67507b8e-35d5-4dff-9239-45b5ef997e53/cert-manager-cainjector/0.log" Mar 13 12:44:29 crc kubenswrapper[4837]: I0313 
12:44:29.659199 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-ht9vn_0e500b82-1f14-4a1e-937d-00248f195033/cert-manager-webhook/0.log" Mar 13 12:44:32 crc kubenswrapper[4837]: I0313 12:44:32.605000 4837 scope.go:117] "RemoveContainer" containerID="62169da6c39018c4d64900197bc422e10f99368271388e87ca1a65e2ba0fb126" Mar 13 12:44:42 crc kubenswrapper[4837]: I0313 12:44:42.639269 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-fpxmr_00b31b3f-b520-493a-ad26-679e09376e81/nmstate-console-plugin/0.log" Mar 13 12:44:42 crc kubenswrapper[4837]: I0313 12:44:42.969696 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-vqqqz_ebe31727-805d-472e-89d3-e99b11435be1/nmstate-handler/0.log" Mar 13 12:44:43 crc kubenswrapper[4837]: I0313 12:44:43.074711 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-8xzdk_5d1f2d02-86ab-4679-a4e4-530ad37e4302/nmstate-metrics/0.log" Mar 13 12:44:43 crc kubenswrapper[4837]: I0313 12:44:43.087400 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-8xzdk_5d1f2d02-86ab-4679-a4e4-530ad37e4302/kube-rbac-proxy/0.log" Mar 13 12:44:43 crc kubenswrapper[4837]: I0313 12:44:43.234113 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-zf78q_ef7096b9-861a-4889-9318-535c35151777/nmstate-operator/0.log" Mar 13 12:44:43 crc kubenswrapper[4837]: I0313 12:44:43.317730 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-6cx5h_0b06c77a-f41d-41a6-b115-f12cc5109c0c/nmstate-webhook/0.log" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.143893 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g"] Mar 
13 12:45:00 crc kubenswrapper[4837]: E0313 12:45:00.144757 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e88464a-8619-4750-ac96-b1ad569fcece" containerName="oc" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.144770 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e88464a-8619-4750-ac96-b1ad569fcece" containerName="oc" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.144957 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e88464a-8619-4750-ac96-b1ad569fcece" containerName="oc" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.145618 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.147435 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.147560 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.165966 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g"] Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.256540 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e6600de-c059-4b00-bb32-d08fa205af0b-secret-volume\") pod \"collect-profiles-29556765-tjz7g\" (UID: \"5e6600de-c059-4b00-bb32-d08fa205af0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.256582 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74z7h\" 
(UniqueName: \"kubernetes.io/projected/5e6600de-c059-4b00-bb32-d08fa205af0b-kube-api-access-74z7h\") pod \"collect-profiles-29556765-tjz7g\" (UID: \"5e6600de-c059-4b00-bb32-d08fa205af0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.256599 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e6600de-c059-4b00-bb32-d08fa205af0b-config-volume\") pod \"collect-profiles-29556765-tjz7g\" (UID: \"5e6600de-c059-4b00-bb32-d08fa205af0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.359024 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e6600de-c059-4b00-bb32-d08fa205af0b-secret-volume\") pod \"collect-profiles-29556765-tjz7g\" (UID: \"5e6600de-c059-4b00-bb32-d08fa205af0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.359068 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e6600de-c059-4b00-bb32-d08fa205af0b-config-volume\") pod \"collect-profiles-29556765-tjz7g\" (UID: \"5e6600de-c059-4b00-bb32-d08fa205af0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.359087 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74z7h\" (UniqueName: \"kubernetes.io/projected/5e6600de-c059-4b00-bb32-d08fa205af0b-kube-api-access-74z7h\") pod \"collect-profiles-29556765-tjz7g\" (UID: \"5e6600de-c059-4b00-bb32-d08fa205af0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" Mar 13 12:45:00 crc 
kubenswrapper[4837]: I0313 12:45:00.359860 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e6600de-c059-4b00-bb32-d08fa205af0b-config-volume\") pod \"collect-profiles-29556765-tjz7g\" (UID: \"5e6600de-c059-4b00-bb32-d08fa205af0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.366157 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e6600de-c059-4b00-bb32-d08fa205af0b-secret-volume\") pod \"collect-profiles-29556765-tjz7g\" (UID: \"5e6600de-c059-4b00-bb32-d08fa205af0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.385238 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74z7h\" (UniqueName: \"kubernetes.io/projected/5e6600de-c059-4b00-bb32-d08fa205af0b-kube-api-access-74z7h\") pod \"collect-profiles-29556765-tjz7g\" (UID: \"5e6600de-c059-4b00-bb32-d08fa205af0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.505075 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.997861 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g"] Mar 13 12:45:01 crc kubenswrapper[4837]: I0313 12:45:01.020951 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" event={"ID":"5e6600de-c059-4b00-bb32-d08fa205af0b","Type":"ContainerStarted","Data":"9223edefc813554c07fd84bc832d210bc6d60ad4f95d18bbbb21c5caf7d7c599"} Mar 13 12:45:02 crc kubenswrapper[4837]: I0313 12:45:02.035597 4837 generic.go:334] "Generic (PLEG): container finished" podID="5e6600de-c059-4b00-bb32-d08fa205af0b" containerID="757048523c491e6eb74ea7ed6665cc66421c95e45bb0822633b5750e9b571caa" exitCode=0 Mar 13 12:45:02 crc kubenswrapper[4837]: I0313 12:45:02.035681 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" event={"ID":"5e6600de-c059-4b00-bb32-d08fa205af0b","Type":"ContainerDied","Data":"757048523c491e6eb74ea7ed6665cc66421c95e45bb0822633b5750e9b571caa"} Mar 13 12:45:03 crc kubenswrapper[4837]: I0313 12:45:03.430983 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" Mar 13 12:45:03 crc kubenswrapper[4837]: I0313 12:45:03.515645 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e6600de-c059-4b00-bb32-d08fa205af0b-secret-volume\") pod \"5e6600de-c059-4b00-bb32-d08fa205af0b\" (UID: \"5e6600de-c059-4b00-bb32-d08fa205af0b\") " Mar 13 12:45:03 crc kubenswrapper[4837]: I0313 12:45:03.516062 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74z7h\" (UniqueName: \"kubernetes.io/projected/5e6600de-c059-4b00-bb32-d08fa205af0b-kube-api-access-74z7h\") pod \"5e6600de-c059-4b00-bb32-d08fa205af0b\" (UID: \"5e6600de-c059-4b00-bb32-d08fa205af0b\") " Mar 13 12:45:03 crc kubenswrapper[4837]: I0313 12:45:03.516099 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e6600de-c059-4b00-bb32-d08fa205af0b-config-volume\") pod \"5e6600de-c059-4b00-bb32-d08fa205af0b\" (UID: \"5e6600de-c059-4b00-bb32-d08fa205af0b\") " Mar 13 12:45:03 crc kubenswrapper[4837]: I0313 12:45:03.517254 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e6600de-c059-4b00-bb32-d08fa205af0b-config-volume" (OuterVolumeSpecName: "config-volume") pod "5e6600de-c059-4b00-bb32-d08fa205af0b" (UID: "5e6600de-c059-4b00-bb32-d08fa205af0b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:45:03 crc kubenswrapper[4837]: I0313 12:45:03.523804 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e6600de-c059-4b00-bb32-d08fa205af0b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5e6600de-c059-4b00-bb32-d08fa205af0b" (UID: "5e6600de-c059-4b00-bb32-d08fa205af0b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:45:03 crc kubenswrapper[4837]: I0313 12:45:03.528009 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e6600de-c059-4b00-bb32-d08fa205af0b-kube-api-access-74z7h" (OuterVolumeSpecName: "kube-api-access-74z7h") pod "5e6600de-c059-4b00-bb32-d08fa205af0b" (UID: "5e6600de-c059-4b00-bb32-d08fa205af0b"). InnerVolumeSpecName "kube-api-access-74z7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:45:03 crc kubenswrapper[4837]: I0313 12:45:03.618202 4837 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e6600de-c059-4b00-bb32-d08fa205af0b-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 12:45:03 crc kubenswrapper[4837]: I0313 12:45:03.618240 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74z7h\" (UniqueName: \"kubernetes.io/projected/5e6600de-c059-4b00-bb32-d08fa205af0b-kube-api-access-74z7h\") on node \"crc\" DevicePath \"\"" Mar 13 12:45:03 crc kubenswrapper[4837]: I0313 12:45:03.618249 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e6600de-c059-4b00-bb32-d08fa205af0b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 12:45:04 crc kubenswrapper[4837]: I0313 12:45:04.056827 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" event={"ID":"5e6600de-c059-4b00-bb32-d08fa205af0b","Type":"ContainerDied","Data":"9223edefc813554c07fd84bc832d210bc6d60ad4f95d18bbbb21c5caf7d7c599"} Mar 13 12:45:04 crc kubenswrapper[4837]: I0313 12:45:04.056867 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9223edefc813554c07fd84bc832d210bc6d60ad4f95d18bbbb21c5caf7d7c599" Mar 13 12:45:04 crc kubenswrapper[4837]: I0313 12:45:04.056900 4837 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" Mar 13 12:45:04 crc kubenswrapper[4837]: I0313 12:45:04.515054 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc"] Mar 13 12:45:04 crc kubenswrapper[4837]: I0313 12:45:04.530031 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc"] Mar 13 12:45:05 crc kubenswrapper[4837]: I0313 12:45:05.083776 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6d18151-32fe-4457-814f-33c3ed53dab8" path="/var/lib/kubelet/pods/a6d18151-32fe-4457-814f-33c3ed53dab8/volumes" Mar 13 12:45:05 crc kubenswrapper[4837]: I0313 12:45:05.484363 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:45:05 crc kubenswrapper[4837]: I0313 12:45:05.484432 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:45:11 crc kubenswrapper[4837]: I0313 12:45:11.962239 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-zm9dj_0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88/kube-rbac-proxy/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.117858 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-zm9dj_0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88/controller/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 
12:45:12.158252 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-frr-files/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.348967 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-frr-files/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.399098 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-reloader/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.402526 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-metrics/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.426233 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-reloader/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.599351 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-metrics/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.607236 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-frr-files/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.623742 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-metrics/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.658687 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-reloader/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.812766 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-frr-files/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.846316 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-reloader/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.861464 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-metrics/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.877546 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/controller/0.log" Mar 13 12:45:13 crc kubenswrapper[4837]: I0313 12:45:13.051612 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/frr-metrics/0.log" Mar 13 12:45:13 crc kubenswrapper[4837]: I0313 12:45:13.077114 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/kube-rbac-proxy/0.log" Mar 13 12:45:13 crc kubenswrapper[4837]: I0313 12:45:13.121121 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/kube-rbac-proxy-frr/0.log" Mar 13 12:45:13 crc kubenswrapper[4837]: I0313 12:45:13.263294 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/reloader/0.log" Mar 13 12:45:13 crc kubenswrapper[4837]: I0313 12:45:13.357026 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-jwgl7_c72405c5-2c81-43f4-93c6-f73f9771be8b/frr-k8s-webhook-server/0.log" Mar 13 12:45:13 crc kubenswrapper[4837]: I0313 12:45:13.852338 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-dcfbdf95f-7x96d_41898fd8-d078-444c-bb55-33f4fb6f3dcc/manager/0.log" Mar 13 12:45:14 crc kubenswrapper[4837]: I0313 12:45:14.031893 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-59b847b88-lrvzm_eabfad13-4fe4-495d-8b6a-2da56ef3b826/webhook-server/0.log" Mar 13 12:45:14 crc kubenswrapper[4837]: I0313 12:45:14.064935 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8skdh_82a5fe00-90be-47b1-a357-69942f385d4f/kube-rbac-proxy/0.log" Mar 13 12:45:14 crc kubenswrapper[4837]: I0313 12:45:14.362713 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/frr/0.log" Mar 13 12:45:14 crc kubenswrapper[4837]: I0313 12:45:14.519462 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8skdh_82a5fe00-90be-47b1-a357-69942f385d4f/speaker/0.log" Mar 13 12:45:26 crc kubenswrapper[4837]: I0313 12:45:26.643094 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/util/0.log" Mar 13 12:45:26 crc kubenswrapper[4837]: I0313 12:45:26.815722 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/util/0.log" Mar 13 12:45:26 crc kubenswrapper[4837]: I0313 12:45:26.841685 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/pull/0.log" Mar 13 12:45:26 crc kubenswrapper[4837]: I0313 12:45:26.889629 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/pull/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.031854 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/extract/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.078882 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/pull/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.095510 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/util/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.214937 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/util/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.410256 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/pull/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.415478 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/util/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.416597 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/pull/0.log" Mar 13 
12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.603700 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/util/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.611234 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/extract/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.616788 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/pull/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.778677 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/extract-utilities/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.921675 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/extract-utilities/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.943356 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/extract-content/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.952820 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/extract-content/0.log" Mar 13 12:45:28 crc kubenswrapper[4837]: I0313 12:45:28.125943 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/extract-utilities/0.log" Mar 13 12:45:28 crc 
kubenswrapper[4837]: I0313 12:45:28.132627 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/extract-content/0.log" Mar 13 12:45:28 crc kubenswrapper[4837]: I0313 12:45:28.406863 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/extract-utilities/0.log" Mar 13 12:45:28 crc kubenswrapper[4837]: I0313 12:45:28.546594 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/extract-utilities/0.log" Mar 13 12:45:28 crc kubenswrapper[4837]: I0313 12:45:28.594217 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/registry-server/0.log" Mar 13 12:45:28 crc kubenswrapper[4837]: I0313 12:45:28.605373 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/extract-content/0.log" Mar 13 12:45:28 crc kubenswrapper[4837]: I0313 12:45:28.643455 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/extract-content/0.log" Mar 13 12:45:28 crc kubenswrapper[4837]: I0313 12:45:28.805250 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/extract-utilities/0.log" Mar 13 12:45:28 crc kubenswrapper[4837]: I0313 12:45:28.846274 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/extract-content/0.log" Mar 13 12:45:29 crc kubenswrapper[4837]: I0313 12:45:29.034311 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-7rzpc_b87c8f86-a346-4907-9441-048c3220646f/marketplace-operator/0.log" Mar 13 12:45:29 crc kubenswrapper[4837]: I0313 12:45:29.152468 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/extract-utilities/0.log" Mar 13 12:45:29 crc kubenswrapper[4837]: I0313 12:45:29.347357 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/extract-utilities/0.log" Mar 13 12:45:29 crc kubenswrapper[4837]: I0313 12:45:29.358313 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/registry-server/0.log" Mar 13 12:45:29 crc kubenswrapper[4837]: I0313 12:45:29.396476 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/extract-content/0.log" Mar 13 12:45:29 crc kubenswrapper[4837]: I0313 12:45:29.427425 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/extract-content/0.log" Mar 13 12:45:29 crc kubenswrapper[4837]: I0313 12:45:29.566230 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/extract-utilities/0.log" Mar 13 12:45:29 crc kubenswrapper[4837]: I0313 12:45:29.571197 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/extract-content/0.log" Mar 13 12:45:29 crc kubenswrapper[4837]: I0313 12:45:29.698088 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/registry-server/0.log" Mar 13 12:45:29 crc kubenswrapper[4837]: I0313 12:45:29.766756 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/extract-utilities/0.log" Mar 13 12:45:29 crc kubenswrapper[4837]: I0313 12:45:29.950144 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/extract-content/0.log" Mar 13 12:45:29 crc kubenswrapper[4837]: I0313 12:45:29.951421 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/extract-utilities/0.log" Mar 13 12:45:29 crc kubenswrapper[4837]: I0313 12:45:29.979868 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/extract-content/0.log" Mar 13 12:45:30 crc kubenswrapper[4837]: I0313 12:45:30.119262 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/extract-utilities/0.log" Mar 13 12:45:30 crc kubenswrapper[4837]: I0313 12:45:30.204952 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/extract-content/0.log" Mar 13 12:45:30 crc kubenswrapper[4837]: I0313 12:45:30.704703 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/registry-server/0.log" Mar 13 12:45:32 crc kubenswrapper[4837]: I0313 12:45:32.681278 4837 scope.go:117] "RemoveContainer" containerID="2d2bfd751903359f1fbdf915afe9614d288e33b823b0215d4cd3578202f69f1c" Mar 13 12:45:35 crc kubenswrapper[4837]: I0313 12:45:35.484120 
4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:45:35 crc kubenswrapper[4837]: I0313 12:45:35.484420 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:46:00 crc kubenswrapper[4837]: I0313 12:46:00.145813 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556766-lgkks"] Mar 13 12:46:00 crc kubenswrapper[4837]: E0313 12:46:00.146675 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e6600de-c059-4b00-bb32-d08fa205af0b" containerName="collect-profiles" Mar 13 12:46:00 crc kubenswrapper[4837]: I0313 12:46:00.146686 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e6600de-c059-4b00-bb32-d08fa205af0b" containerName="collect-profiles" Mar 13 12:46:00 crc kubenswrapper[4837]: I0313 12:46:00.146925 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e6600de-c059-4b00-bb32-d08fa205af0b" containerName="collect-profiles" Mar 13 12:46:00 crc kubenswrapper[4837]: I0313 12:46:00.147710 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556766-lgkks" Mar 13 12:46:00 crc kubenswrapper[4837]: I0313 12:46:00.150554 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:46:00 crc kubenswrapper[4837]: I0313 12:46:00.157380 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556766-lgkks"] Mar 13 12:46:00 crc kubenswrapper[4837]: I0313 12:46:00.190282 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:46:00 crc kubenswrapper[4837]: I0313 12:46:00.191106 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:46:00 crc kubenswrapper[4837]: I0313 12:46:00.303128 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbx56\" (UniqueName: \"kubernetes.io/projected/528b7541-8fac-4df9-9168-c3166532618d-kube-api-access-bbx56\") pod \"auto-csr-approver-29556766-lgkks\" (UID: \"528b7541-8fac-4df9-9168-c3166532618d\") " pod="openshift-infra/auto-csr-approver-29556766-lgkks" Mar 13 12:46:00 crc kubenswrapper[4837]: I0313 12:46:00.404817 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbx56\" (UniqueName: \"kubernetes.io/projected/528b7541-8fac-4df9-9168-c3166532618d-kube-api-access-bbx56\") pod \"auto-csr-approver-29556766-lgkks\" (UID: \"528b7541-8fac-4df9-9168-c3166532618d\") " pod="openshift-infra/auto-csr-approver-29556766-lgkks" Mar 13 12:46:00 crc kubenswrapper[4837]: I0313 12:46:00.426473 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbx56\" (UniqueName: \"kubernetes.io/projected/528b7541-8fac-4df9-9168-c3166532618d-kube-api-access-bbx56\") pod \"auto-csr-approver-29556766-lgkks\" (UID: \"528b7541-8fac-4df9-9168-c3166532618d\") " 
pod="openshift-infra/auto-csr-approver-29556766-lgkks" Mar 13 12:46:00 crc kubenswrapper[4837]: I0313 12:46:00.508000 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556766-lgkks" Mar 13 12:46:00 crc kubenswrapper[4837]: I0313 12:46:00.966994 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556766-lgkks"] Mar 13 12:46:01 crc kubenswrapper[4837]: I0313 12:46:01.580678 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556766-lgkks" event={"ID":"528b7541-8fac-4df9-9168-c3166532618d","Type":"ContainerStarted","Data":"15fdf260725a5dcb59db67a57200a80bab5a12f32fd0ab8f574e699cb6938eb0"} Mar 13 12:46:02 crc kubenswrapper[4837]: I0313 12:46:02.589711 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556766-lgkks" event={"ID":"528b7541-8fac-4df9-9168-c3166532618d","Type":"ContainerDied","Data":"ef54f8788c02611298b8e701f2aadf7ddd17abb3f2f5d777925238c78d7d9c68"} Mar 13 12:46:02 crc kubenswrapper[4837]: I0313 12:46:02.589617 4837 generic.go:334] "Generic (PLEG): container finished" podID="528b7541-8fac-4df9-9168-c3166532618d" containerID="ef54f8788c02611298b8e701f2aadf7ddd17abb3f2f5d777925238c78d7d9c68" exitCode=0 Mar 13 12:46:03 crc kubenswrapper[4837]: I0313 12:46:03.983834 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556766-lgkks" Mar 13 12:46:04 crc kubenswrapper[4837]: I0313 12:46:04.073581 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbx56\" (UniqueName: \"kubernetes.io/projected/528b7541-8fac-4df9-9168-c3166532618d-kube-api-access-bbx56\") pod \"528b7541-8fac-4df9-9168-c3166532618d\" (UID: \"528b7541-8fac-4df9-9168-c3166532618d\") " Mar 13 12:46:04 crc kubenswrapper[4837]: I0313 12:46:04.080082 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/528b7541-8fac-4df9-9168-c3166532618d-kube-api-access-bbx56" (OuterVolumeSpecName: "kube-api-access-bbx56") pod "528b7541-8fac-4df9-9168-c3166532618d" (UID: "528b7541-8fac-4df9-9168-c3166532618d"). InnerVolumeSpecName "kube-api-access-bbx56". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:46:04 crc kubenswrapper[4837]: I0313 12:46:04.179828 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbx56\" (UniqueName: \"kubernetes.io/projected/528b7541-8fac-4df9-9168-c3166532618d-kube-api-access-bbx56\") on node \"crc\" DevicePath \"\"" Mar 13 12:46:04 crc kubenswrapper[4837]: I0313 12:46:04.609809 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556766-lgkks" event={"ID":"528b7541-8fac-4df9-9168-c3166532618d","Type":"ContainerDied","Data":"15fdf260725a5dcb59db67a57200a80bab5a12f32fd0ab8f574e699cb6938eb0"} Mar 13 12:46:04 crc kubenswrapper[4837]: I0313 12:46:04.609850 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15fdf260725a5dcb59db67a57200a80bab5a12f32fd0ab8f574e699cb6938eb0" Mar 13 12:46:04 crc kubenswrapper[4837]: I0313 12:46:04.610534 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556766-lgkks" Mar 13 12:46:05 crc kubenswrapper[4837]: I0313 12:46:05.071339 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556760-zcnxn"] Mar 13 12:46:05 crc kubenswrapper[4837]: I0313 12:46:05.073511 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556760-zcnxn"] Mar 13 12:46:05 crc kubenswrapper[4837]: I0313 12:46:05.483580 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:46:05 crc kubenswrapper[4837]: I0313 12:46:05.483934 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:46:05 crc kubenswrapper[4837]: I0313 12:46:05.484000 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 12:46:05 crc kubenswrapper[4837]: I0313 12:46:05.484826 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"42895e7497f11e52c6189b0227f4673591fee559ac68adfdd28355562f8112bd"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:46:05 crc kubenswrapper[4837]: I0313 12:46:05.484902 4837 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://42895e7497f11e52c6189b0227f4673591fee559ac68adfdd28355562f8112bd" gracePeriod=600 Mar 13 12:46:05 crc kubenswrapper[4837]: I0313 12:46:05.626057 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="42895e7497f11e52c6189b0227f4673591fee559ac68adfdd28355562f8112bd" exitCode=0 Mar 13 12:46:05 crc kubenswrapper[4837]: I0313 12:46:05.626116 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"42895e7497f11e52c6189b0227f4673591fee559ac68adfdd28355562f8112bd"} Mar 13 12:46:05 crc kubenswrapper[4837]: I0313 12:46:05.626153 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:46:06 crc kubenswrapper[4837]: I0313 12:46:06.635552 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0"} Mar 13 12:46:07 crc kubenswrapper[4837]: I0313 12:46:07.062848 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a273cb74-6dcc-4e87-8f25-db5c77132250" path="/var/lib/kubelet/pods/a273cb74-6dcc-4e87-8f25-db5c77132250/volumes" Mar 13 12:46:32 crc kubenswrapper[4837]: I0313 12:46:32.746757 4837 scope.go:117] "RemoveContainer" containerID="da2f9878f57615785241ef1796e14de81a297b17cfd5ebaf3f55711c66c5482b" Mar 13 12:47:18 crc kubenswrapper[4837]: I0313 12:47:18.315730 4837 generic.go:334] "Generic (PLEG): container finished" podID="8822de14-eaa5-4016-91fd-611718d9b51a" 
containerID="46bee07e0cc64861f34813174541cff76485e4b8cd9b5fb84ab93fd9eff59fed" exitCode=0 Mar 13 12:47:18 crc kubenswrapper[4837]: I0313 12:47:18.315948 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jkb99/must-gather-4lckb" event={"ID":"8822de14-eaa5-4016-91fd-611718d9b51a","Type":"ContainerDied","Data":"46bee07e0cc64861f34813174541cff76485e4b8cd9b5fb84ab93fd9eff59fed"} Mar 13 12:47:18 crc kubenswrapper[4837]: I0313 12:47:18.316899 4837 scope.go:117] "RemoveContainer" containerID="46bee07e0cc64861f34813174541cff76485e4b8cd9b5fb84ab93fd9eff59fed" Mar 13 12:47:18 crc kubenswrapper[4837]: I0313 12:47:18.912735 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jkb99_must-gather-4lckb_8822de14-eaa5-4016-91fd-611718d9b51a/gather/0.log" Mar 13 12:47:27 crc kubenswrapper[4837]: I0313 12:47:27.259454 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jkb99/must-gather-4lckb"] Mar 13 12:47:27 crc kubenswrapper[4837]: I0313 12:47:27.260330 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-jkb99/must-gather-4lckb" podUID="8822de14-eaa5-4016-91fd-611718d9b51a" containerName="copy" containerID="cri-o://d7374ab200a788a99f53fe2448f4035d1be2d4984c27b0031e0578210408765b" gracePeriod=2 Mar 13 12:47:27 crc kubenswrapper[4837]: I0313 12:47:27.269480 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jkb99/must-gather-4lckb"] Mar 13 12:47:27 crc kubenswrapper[4837]: I0313 12:47:27.402910 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jkb99_must-gather-4lckb_8822de14-eaa5-4016-91fd-611718d9b51a/copy/0.log" Mar 13 12:47:27 crc kubenswrapper[4837]: I0313 12:47:27.403207 4837 generic.go:334] "Generic (PLEG): container finished" podID="8822de14-eaa5-4016-91fd-611718d9b51a" containerID="d7374ab200a788a99f53fe2448f4035d1be2d4984c27b0031e0578210408765b" exitCode=143 Mar 13 
12:47:27 crc kubenswrapper[4837]: I0313 12:47:27.770703 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jkb99_must-gather-4lckb_8822de14-eaa5-4016-91fd-611718d9b51a/copy/0.log" Mar 13 12:47:27 crc kubenswrapper[4837]: I0313 12:47:27.771320 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jkb99/must-gather-4lckb" Mar 13 12:47:27 crc kubenswrapper[4837]: I0313 12:47:27.801345 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8822de14-eaa5-4016-91fd-611718d9b51a-must-gather-output\") pod \"8822de14-eaa5-4016-91fd-611718d9b51a\" (UID: \"8822de14-eaa5-4016-91fd-611718d9b51a\") " Mar 13 12:47:27 crc kubenswrapper[4837]: I0313 12:47:27.801406 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbzbt\" (UniqueName: \"kubernetes.io/projected/8822de14-eaa5-4016-91fd-611718d9b51a-kube-api-access-zbzbt\") pod \"8822de14-eaa5-4016-91fd-611718d9b51a\" (UID: \"8822de14-eaa5-4016-91fd-611718d9b51a\") " Mar 13 12:47:27 crc kubenswrapper[4837]: I0313 12:47:27.807519 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8822de14-eaa5-4016-91fd-611718d9b51a-kube-api-access-zbzbt" (OuterVolumeSpecName: "kube-api-access-zbzbt") pod "8822de14-eaa5-4016-91fd-611718d9b51a" (UID: "8822de14-eaa5-4016-91fd-611718d9b51a"). InnerVolumeSpecName "kube-api-access-zbzbt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:47:27 crc kubenswrapper[4837]: I0313 12:47:27.904599 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbzbt\" (UniqueName: \"kubernetes.io/projected/8822de14-eaa5-4016-91fd-611718d9b51a-kube-api-access-zbzbt\") on node \"crc\" DevicePath \"\"" Mar 13 12:47:27 crc kubenswrapper[4837]: I0313 12:47:27.963904 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8822de14-eaa5-4016-91fd-611718d9b51a-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "8822de14-eaa5-4016-91fd-611718d9b51a" (UID: "8822de14-eaa5-4016-91fd-611718d9b51a"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:47:28 crc kubenswrapper[4837]: I0313 12:47:28.006855 4837 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8822de14-eaa5-4016-91fd-611718d9b51a-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 13 12:47:28 crc kubenswrapper[4837]: I0313 12:47:28.414027 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jkb99_must-gather-4lckb_8822de14-eaa5-4016-91fd-611718d9b51a/copy/0.log" Mar 13 12:47:28 crc kubenswrapper[4837]: I0313 12:47:28.416334 4837 scope.go:117] "RemoveContainer" containerID="d7374ab200a788a99f53fe2448f4035d1be2d4984c27b0031e0578210408765b" Mar 13 12:47:28 crc kubenswrapper[4837]: I0313 12:47:28.416439 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jkb99/must-gather-4lckb" Mar 13 12:47:28 crc kubenswrapper[4837]: I0313 12:47:28.454900 4837 scope.go:117] "RemoveContainer" containerID="46bee07e0cc64861f34813174541cff76485e4b8cd9b5fb84ab93fd9eff59fed" Mar 13 12:47:29 crc kubenswrapper[4837]: I0313 12:47:29.064847 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8822de14-eaa5-4016-91fd-611718d9b51a" path="/var/lib/kubelet/pods/8822de14-eaa5-4016-91fd-611718d9b51a/volumes" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.147733 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556768-f66sk"] Mar 13 12:48:00 crc kubenswrapper[4837]: E0313 12:48:00.148677 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8822de14-eaa5-4016-91fd-611718d9b51a" containerName="copy" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.148693 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8822de14-eaa5-4016-91fd-611718d9b51a" containerName="copy" Mar 13 12:48:00 crc kubenswrapper[4837]: E0313 12:48:00.148724 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="528b7541-8fac-4df9-9168-c3166532618d" containerName="oc" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.148732 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="528b7541-8fac-4df9-9168-c3166532618d" containerName="oc" Mar 13 12:48:00 crc kubenswrapper[4837]: E0313 12:48:00.148748 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8822de14-eaa5-4016-91fd-611718d9b51a" containerName="gather" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.148756 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8822de14-eaa5-4016-91fd-611718d9b51a" containerName="gather" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.148995 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8822de14-eaa5-4016-91fd-611718d9b51a" containerName="copy" Mar 
13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.149012 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="528b7541-8fac-4df9-9168-c3166532618d" containerName="oc" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.149026 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8822de14-eaa5-4016-91fd-611718d9b51a" containerName="gather" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.149796 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556768-f66sk" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.151975 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.154910 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.155302 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.186825 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556768-f66sk"] Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.208953 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7wzc\" (UniqueName: \"kubernetes.io/projected/1b297ac1-71ba-4b15-b915-a38f9da4ebb7-kube-api-access-s7wzc\") pod \"auto-csr-approver-29556768-f66sk\" (UID: \"1b297ac1-71ba-4b15-b915-a38f9da4ebb7\") " pod="openshift-infra/auto-csr-approver-29556768-f66sk" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.311924 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7wzc\" (UniqueName: \"kubernetes.io/projected/1b297ac1-71ba-4b15-b915-a38f9da4ebb7-kube-api-access-s7wzc\") pod 
\"auto-csr-approver-29556768-f66sk\" (UID: \"1b297ac1-71ba-4b15-b915-a38f9da4ebb7\") " pod="openshift-infra/auto-csr-approver-29556768-f66sk" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.331991 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7wzc\" (UniqueName: \"kubernetes.io/projected/1b297ac1-71ba-4b15-b915-a38f9da4ebb7-kube-api-access-s7wzc\") pod \"auto-csr-approver-29556768-f66sk\" (UID: \"1b297ac1-71ba-4b15-b915-a38f9da4ebb7\") " pod="openshift-infra/auto-csr-approver-29556768-f66sk" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.495583 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556768-f66sk" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.922608 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556768-f66sk"] Mar 13 12:48:00 crc kubenswrapper[4837]: W0313 12:48:00.931208 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b297ac1_71ba_4b15_b915_a38f9da4ebb7.slice/crio-be55f861f9827b65e56a18da973cb785ee7a3d9371ba71c4db0877383b2afeb1 WatchSource:0}: Error finding container be55f861f9827b65e56a18da973cb785ee7a3d9371ba71c4db0877383b2afeb1: Status 404 returned error can't find the container with id be55f861f9827b65e56a18da973cb785ee7a3d9371ba71c4db0877383b2afeb1 Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.933894 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:48:01 crc kubenswrapper[4837]: I0313 12:48:01.736332 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556768-f66sk" event={"ID":"1b297ac1-71ba-4b15-b915-a38f9da4ebb7","Type":"ContainerStarted","Data":"be55f861f9827b65e56a18da973cb785ee7a3d9371ba71c4db0877383b2afeb1"} Mar 13 12:48:02 crc kubenswrapper[4837]: I0313 
12:48:02.749875 4837 generic.go:334] "Generic (PLEG): container finished" podID="1b297ac1-71ba-4b15-b915-a38f9da4ebb7" containerID="6da52e600ecb49afa497ca1fed54ebec9623af66e73a4cbe5e0c9804569c398b" exitCode=0 Mar 13 12:48:02 crc kubenswrapper[4837]: I0313 12:48:02.750171 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556768-f66sk" event={"ID":"1b297ac1-71ba-4b15-b915-a38f9da4ebb7","Type":"ContainerDied","Data":"6da52e600ecb49afa497ca1fed54ebec9623af66e73a4cbe5e0c9804569c398b"} Mar 13 12:48:04 crc kubenswrapper[4837]: I0313 12:48:04.104372 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556768-f66sk" Mar 13 12:48:04 crc kubenswrapper[4837]: I0313 12:48:04.277414 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7wzc\" (UniqueName: \"kubernetes.io/projected/1b297ac1-71ba-4b15-b915-a38f9da4ebb7-kube-api-access-s7wzc\") pod \"1b297ac1-71ba-4b15-b915-a38f9da4ebb7\" (UID: \"1b297ac1-71ba-4b15-b915-a38f9da4ebb7\") " Mar 13 12:48:04 crc kubenswrapper[4837]: I0313 12:48:04.284401 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b297ac1-71ba-4b15-b915-a38f9da4ebb7-kube-api-access-s7wzc" (OuterVolumeSpecName: "kube-api-access-s7wzc") pod "1b297ac1-71ba-4b15-b915-a38f9da4ebb7" (UID: "1b297ac1-71ba-4b15-b915-a38f9da4ebb7"). InnerVolumeSpecName "kube-api-access-s7wzc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:48:04 crc kubenswrapper[4837]: I0313 12:48:04.379962 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7wzc\" (UniqueName: \"kubernetes.io/projected/1b297ac1-71ba-4b15-b915-a38f9da4ebb7-kube-api-access-s7wzc\") on node \"crc\" DevicePath \"\"" Mar 13 12:48:04 crc kubenswrapper[4837]: I0313 12:48:04.778488 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556768-f66sk" event={"ID":"1b297ac1-71ba-4b15-b915-a38f9da4ebb7","Type":"ContainerDied","Data":"be55f861f9827b65e56a18da973cb785ee7a3d9371ba71c4db0877383b2afeb1"} Mar 13 12:48:04 crc kubenswrapper[4837]: I0313 12:48:04.778538 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be55f861f9827b65e56a18da973cb785ee7a3d9371ba71c4db0877383b2afeb1" Mar 13 12:48:04 crc kubenswrapper[4837]: I0313 12:48:04.778552 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556768-f66sk" Mar 13 12:48:05 crc kubenswrapper[4837]: I0313 12:48:05.167977 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556762-g52qb"] Mar 13 12:48:05 crc kubenswrapper[4837]: I0313 12:48:05.176958 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556762-g52qb"] Mar 13 12:48:05 crc kubenswrapper[4837]: I0313 12:48:05.483579 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:48:05 crc kubenswrapper[4837]: I0313 12:48:05.484293 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" 
podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:48:07 crc kubenswrapper[4837]: I0313 12:48:07.062014 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4" path="/var/lib/kubelet/pods/e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4/volumes" Mar 13 12:48:32 crc kubenswrapper[4837]: I0313 12:48:32.866404 4837 scope.go:117] "RemoveContainer" containerID="d6ca53672f75fdcf8f31c32bb76f3e903dae1282d3f22ff4ff5cc9e6da3282e1" Mar 13 12:48:35 crc kubenswrapper[4837]: I0313 12:48:35.483902 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:48:35 crc kubenswrapper[4837]: I0313 12:48:35.484525 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:49:05 crc kubenswrapper[4837]: I0313 12:49:05.483801 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:49:05 crc kubenswrapper[4837]: I0313 12:49:05.484325 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:49:05 crc kubenswrapper[4837]: I0313 12:49:05.484364 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 12:49:05 crc kubenswrapper[4837]: I0313 12:49:05.484949 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:49:05 crc kubenswrapper[4837]: I0313 12:49:05.485001 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" gracePeriod=600 Mar 13 12:49:05 crc kubenswrapper[4837]: E0313 12:49:05.605561 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:49:06 crc kubenswrapper[4837]: I0313 12:49:06.326329 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" exitCode=0 Mar 13 12:49:06 crc kubenswrapper[4837]: I0313 
12:49:06.326377 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0"} Mar 13 12:49:06 crc kubenswrapper[4837]: I0313 12:49:06.326462 4837 scope.go:117] "RemoveContainer" containerID="42895e7497f11e52c6189b0227f4673591fee559ac68adfdd28355562f8112bd" Mar 13 12:49:06 crc kubenswrapper[4837]: I0313 12:49:06.327135 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:49:06 crc kubenswrapper[4837]: E0313 12:49:06.327436 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:49:18 crc kubenswrapper[4837]: I0313 12:49:18.048909 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:49:18 crc kubenswrapper[4837]: E0313 12:49:18.049894 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:49:32 crc kubenswrapper[4837]: I0313 12:49:32.938863 4837 scope.go:117] "RemoveContainer" containerID="021f2f7590a98a1912559c67d885639fef8ea6affc1fcb856c58211036ebcb42" Mar 13 
12:49:33 crc kubenswrapper[4837]: I0313 12:49:33.048664 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:49:33 crc kubenswrapper[4837]: E0313 12:49:33.049358 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:49:44 crc kubenswrapper[4837]: I0313 12:49:44.048336 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:49:44 crc kubenswrapper[4837]: E0313 12:49:44.049136 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:49:59 crc kubenswrapper[4837]: I0313 12:49:59.048230 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:49:59 crc kubenswrapper[4837]: E0313 12:49:59.049052 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" 
podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:50:00 crc kubenswrapper[4837]: I0313 12:50:00.153260 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556770-5vjcn"] Mar 13 12:50:00 crc kubenswrapper[4837]: E0313 12:50:00.153782 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b297ac1-71ba-4b15-b915-a38f9da4ebb7" containerName="oc" Mar 13 12:50:00 crc kubenswrapper[4837]: I0313 12:50:00.153798 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b297ac1-71ba-4b15-b915-a38f9da4ebb7" containerName="oc" Mar 13 12:50:00 crc kubenswrapper[4837]: I0313 12:50:00.154051 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b297ac1-71ba-4b15-b915-a38f9da4ebb7" containerName="oc" Mar 13 12:50:00 crc kubenswrapper[4837]: I0313 12:50:00.154806 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556770-5vjcn" Mar 13 12:50:00 crc kubenswrapper[4837]: I0313 12:50:00.156969 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:50:00 crc kubenswrapper[4837]: I0313 12:50:00.157066 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:50:00 crc kubenswrapper[4837]: I0313 12:50:00.157330 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:50:00 crc kubenswrapper[4837]: I0313 12:50:00.164622 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556770-5vjcn"] Mar 13 12:50:00 crc kubenswrapper[4837]: I0313 12:50:00.231982 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf7lw\" (UniqueName: \"kubernetes.io/projected/39e3042e-9415-4734-bfa5-8def0b858b6e-kube-api-access-xf7lw\") pod 
\"auto-csr-approver-29556770-5vjcn\" (UID: \"39e3042e-9415-4734-bfa5-8def0b858b6e\") " pod="openshift-infra/auto-csr-approver-29556770-5vjcn" Mar 13 12:50:00 crc kubenswrapper[4837]: I0313 12:50:00.334232 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf7lw\" (UniqueName: \"kubernetes.io/projected/39e3042e-9415-4734-bfa5-8def0b858b6e-kube-api-access-xf7lw\") pod \"auto-csr-approver-29556770-5vjcn\" (UID: \"39e3042e-9415-4734-bfa5-8def0b858b6e\") " pod="openshift-infra/auto-csr-approver-29556770-5vjcn" Mar 13 12:50:00 crc kubenswrapper[4837]: I0313 12:50:00.358477 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf7lw\" (UniqueName: \"kubernetes.io/projected/39e3042e-9415-4734-bfa5-8def0b858b6e-kube-api-access-xf7lw\") pod \"auto-csr-approver-29556770-5vjcn\" (UID: \"39e3042e-9415-4734-bfa5-8def0b858b6e\") " pod="openshift-infra/auto-csr-approver-29556770-5vjcn" Mar 13 12:50:00 crc kubenswrapper[4837]: I0313 12:50:00.479424 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556770-5vjcn" Mar 13 12:50:00 crc kubenswrapper[4837]: I0313 12:50:00.937937 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556770-5vjcn"] Mar 13 12:50:01 crc kubenswrapper[4837]: I0313 12:50:01.856917 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556770-5vjcn" event={"ID":"39e3042e-9415-4734-bfa5-8def0b858b6e","Type":"ContainerStarted","Data":"f54e70c22239b0c1bbae84029edcf748edb7110b2fb1a844699a97a3a7ed2e7d"} Mar 13 12:50:02 crc kubenswrapper[4837]: I0313 12:50:02.868259 4837 generic.go:334] "Generic (PLEG): container finished" podID="39e3042e-9415-4734-bfa5-8def0b858b6e" containerID="8a03a622bd1e0141b38071e7ff2bc9ecddb0162408970736756d5805f18fdf44" exitCode=0 Mar 13 12:50:02 crc kubenswrapper[4837]: I0313 12:50:02.868350 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556770-5vjcn" event={"ID":"39e3042e-9415-4734-bfa5-8def0b858b6e","Type":"ContainerDied","Data":"8a03a622bd1e0141b38071e7ff2bc9ecddb0162408970736756d5805f18fdf44"} Mar 13 12:50:04 crc kubenswrapper[4837]: I0313 12:50:04.189115 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556770-5vjcn" Mar 13 12:50:04 crc kubenswrapper[4837]: I0313 12:50:04.310775 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf7lw\" (UniqueName: \"kubernetes.io/projected/39e3042e-9415-4734-bfa5-8def0b858b6e-kube-api-access-xf7lw\") pod \"39e3042e-9415-4734-bfa5-8def0b858b6e\" (UID: \"39e3042e-9415-4734-bfa5-8def0b858b6e\") " Mar 13 12:50:04 crc kubenswrapper[4837]: I0313 12:50:04.324900 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39e3042e-9415-4734-bfa5-8def0b858b6e-kube-api-access-xf7lw" (OuterVolumeSpecName: "kube-api-access-xf7lw") pod "39e3042e-9415-4734-bfa5-8def0b858b6e" (UID: "39e3042e-9415-4734-bfa5-8def0b858b6e"). InnerVolumeSpecName "kube-api-access-xf7lw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:50:04 crc kubenswrapper[4837]: I0313 12:50:04.413748 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf7lw\" (UniqueName: \"kubernetes.io/projected/39e3042e-9415-4734-bfa5-8def0b858b6e-kube-api-access-xf7lw\") on node \"crc\" DevicePath \"\"" Mar 13 12:50:04 crc kubenswrapper[4837]: I0313 12:50:04.888136 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556770-5vjcn" event={"ID":"39e3042e-9415-4734-bfa5-8def0b858b6e","Type":"ContainerDied","Data":"f54e70c22239b0c1bbae84029edcf748edb7110b2fb1a844699a97a3a7ed2e7d"} Mar 13 12:50:04 crc kubenswrapper[4837]: I0313 12:50:04.888183 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f54e70c22239b0c1bbae84029edcf748edb7110b2fb1a844699a97a3a7ed2e7d" Mar 13 12:50:04 crc kubenswrapper[4837]: I0313 12:50:04.888228 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556770-5vjcn" Mar 13 12:50:05 crc kubenswrapper[4837]: I0313 12:50:05.273182 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556764-r7n49"] Mar 13 12:50:05 crc kubenswrapper[4837]: I0313 12:50:05.285746 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556764-r7n49"] Mar 13 12:50:07 crc kubenswrapper[4837]: I0313 12:50:07.059947 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e88464a-8619-4750-ac96-b1ad569fcece" path="/var/lib/kubelet/pods/7e88464a-8619-4750-ac96-b1ad569fcece/volumes" Mar 13 12:50:10 crc kubenswrapper[4837]: I0313 12:50:10.048176 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:50:10 crc kubenswrapper[4837]: E0313 12:50:10.049941 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:50:23 crc kubenswrapper[4837]: I0313 12:50:23.049139 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:50:23 crc kubenswrapper[4837]: E0313 12:50:23.049951 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" 
podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8"
Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.126860 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lkqv5/must-gather-vz7zz"]
Mar 13 12:50:30 crc kubenswrapper[4837]: E0313 12:50:30.127462 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39e3042e-9415-4734-bfa5-8def0b858b6e" containerName="oc"
Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.127477 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="39e3042e-9415-4734-bfa5-8def0b858b6e" containerName="oc"
Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.127696 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="39e3042e-9415-4734-bfa5-8def0b858b6e" containerName="oc"
Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.128725 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lkqv5/must-gather-vz7zz"
Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.133170 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-lkqv5"/"kube-root-ca.crt"
Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.133802 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-lkqv5"/"openshift-service-ca.crt"
Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.148993 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lkqv5/must-gather-vz7zz"]
Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.224332 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzd72\" (UniqueName: \"kubernetes.io/projected/130c1c0e-31b1-415d-aab2-fab358576a73-kube-api-access-tzd72\") pod \"must-gather-vz7zz\" (UID: \"130c1c0e-31b1-415d-aab2-fab358576a73\") " pod="openshift-must-gather-lkqv5/must-gather-vz7zz"
Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.224400 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/130c1c0e-31b1-415d-aab2-fab358576a73-must-gather-output\") pod \"must-gather-vz7zz\" (UID: \"130c1c0e-31b1-415d-aab2-fab358576a73\") " pod="openshift-must-gather-lkqv5/must-gather-vz7zz"
Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.326075 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzd72\" (UniqueName: \"kubernetes.io/projected/130c1c0e-31b1-415d-aab2-fab358576a73-kube-api-access-tzd72\") pod \"must-gather-vz7zz\" (UID: \"130c1c0e-31b1-415d-aab2-fab358576a73\") " pod="openshift-must-gather-lkqv5/must-gather-vz7zz"
Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.326116 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/130c1c0e-31b1-415d-aab2-fab358576a73-must-gather-output\") pod \"must-gather-vz7zz\" (UID: \"130c1c0e-31b1-415d-aab2-fab358576a73\") " pod="openshift-must-gather-lkqv5/must-gather-vz7zz"
Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.326599 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/130c1c0e-31b1-415d-aab2-fab358576a73-must-gather-output\") pod \"must-gather-vz7zz\" (UID: \"130c1c0e-31b1-415d-aab2-fab358576a73\") " pod="openshift-must-gather-lkqv5/must-gather-vz7zz"
Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.351450 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzd72\" (UniqueName: \"kubernetes.io/projected/130c1c0e-31b1-415d-aab2-fab358576a73-kube-api-access-tzd72\") pod \"must-gather-vz7zz\" (UID: \"130c1c0e-31b1-415d-aab2-fab358576a73\") " pod="openshift-must-gather-lkqv5/must-gather-vz7zz"
Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.494771 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lkqv5/must-gather-vz7zz"
Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.741899 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lkqv5/must-gather-vz7zz"]
Mar 13 12:50:31 crc kubenswrapper[4837]: I0313 12:50:31.180883 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lkqv5/must-gather-vz7zz" event={"ID":"130c1c0e-31b1-415d-aab2-fab358576a73","Type":"ContainerStarted","Data":"433a139fea2255c45e8580415a3deca8258493b41f46198b67c0eac345fb5a75"}
Mar 13 12:50:31 crc kubenswrapper[4837]: I0313 12:50:31.180937 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lkqv5/must-gather-vz7zz" event={"ID":"130c1c0e-31b1-415d-aab2-fab358576a73","Type":"ContainerStarted","Data":"67e6ef7e4be8b057b18838efc07c184fe5840d59f0bbbe9911d4187ef608fd8c"}
Mar 13 12:50:32 crc kubenswrapper[4837]: I0313 12:50:32.216809 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lkqv5/must-gather-vz7zz" event={"ID":"130c1c0e-31b1-415d-aab2-fab358576a73","Type":"ContainerStarted","Data":"bc010a3c2a92443b50c947cd27f9323f2921ea8aae80c058217be8b624f5d427"}
Mar 13 12:50:32 crc kubenswrapper[4837]: I0313 12:50:32.248321 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lkqv5/must-gather-vz7zz" podStartSLOduration=2.248297288 podStartE2EDuration="2.248297288s" podCreationTimestamp="2026-03-13 12:50:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:50:32.23555377 +0000 UTC m=+3747.873820533" watchObservedRunningTime="2026-03-13 12:50:32.248297288 +0000 UTC m=+3747.886564061"
Mar 13 12:50:32 crc kubenswrapper[4837]: I0313 12:50:32.983883 4837 scope.go:117] "RemoveContainer" containerID="1dc84242c71f8e5d31bcd05b0ae44aeb29c8a625295bbab7f2eb79c610ba55a4"
Mar 13 12:50:34 crc kubenswrapper[4837]: I0313 12:50:34.549522 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lkqv5/crc-debug-djqf7"]
Mar 13 12:50:34 crc kubenswrapper[4837]: I0313 12:50:34.551014 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lkqv5/crc-debug-djqf7"
Mar 13 12:50:34 crc kubenswrapper[4837]: I0313 12:50:34.555451 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-lkqv5"/"default-dockercfg-p2glw"
Mar 13 12:50:34 crc kubenswrapper[4837]: I0313 12:50:34.707651 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb9wk\" (UniqueName: \"kubernetes.io/projected/70913aeb-b1cd-4a84-b043-8b10d8a28196-kube-api-access-qb9wk\") pod \"crc-debug-djqf7\" (UID: \"70913aeb-b1cd-4a84-b043-8b10d8a28196\") " pod="openshift-must-gather-lkqv5/crc-debug-djqf7"
Mar 13 12:50:34 crc kubenswrapper[4837]: I0313 12:50:34.707894 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70913aeb-b1cd-4a84-b043-8b10d8a28196-host\") pod \"crc-debug-djqf7\" (UID: \"70913aeb-b1cd-4a84-b043-8b10d8a28196\") " pod="openshift-must-gather-lkqv5/crc-debug-djqf7"
Mar 13 12:50:34 crc kubenswrapper[4837]: I0313 12:50:34.810094 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb9wk\" (UniqueName: \"kubernetes.io/projected/70913aeb-b1cd-4a84-b043-8b10d8a28196-kube-api-access-qb9wk\") pod \"crc-debug-djqf7\" (UID: \"70913aeb-b1cd-4a84-b043-8b10d8a28196\") " pod="openshift-must-gather-lkqv5/crc-debug-djqf7"
Mar 13 12:50:34 crc kubenswrapper[4837]: I0313 12:50:34.810221 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70913aeb-b1cd-4a84-b043-8b10d8a28196-host\") pod \"crc-debug-djqf7\" (UID: \"70913aeb-b1cd-4a84-b043-8b10d8a28196\") " pod="openshift-must-gather-lkqv5/crc-debug-djqf7"
Mar 13 12:50:34 crc kubenswrapper[4837]: I0313 12:50:34.810310 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70913aeb-b1cd-4a84-b043-8b10d8a28196-host\") pod \"crc-debug-djqf7\" (UID: \"70913aeb-b1cd-4a84-b043-8b10d8a28196\") " pod="openshift-must-gather-lkqv5/crc-debug-djqf7"
Mar 13 12:50:34 crc kubenswrapper[4837]: I0313 12:50:34.828236 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb9wk\" (UniqueName: \"kubernetes.io/projected/70913aeb-b1cd-4a84-b043-8b10d8a28196-kube-api-access-qb9wk\") pod \"crc-debug-djqf7\" (UID: \"70913aeb-b1cd-4a84-b043-8b10d8a28196\") " pod="openshift-must-gather-lkqv5/crc-debug-djqf7"
Mar 13 12:50:34 crc kubenswrapper[4837]: I0313 12:50:34.868476 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lkqv5/crc-debug-djqf7"
Mar 13 12:50:34 crc kubenswrapper[4837]: W0313 12:50:34.917391 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70913aeb_b1cd_4a84_b043_8b10d8a28196.slice/crio-a84cdf79e4cee6ea4dcee02331b86c1229aab71ff3081c6570c2ff689e35e097 WatchSource:0}: Error finding container a84cdf79e4cee6ea4dcee02331b86c1229aab71ff3081c6570c2ff689e35e097: Status 404 returned error can't find the container with id a84cdf79e4cee6ea4dcee02331b86c1229aab71ff3081c6570c2ff689e35e097
Mar 13 12:50:35 crc kubenswrapper[4837]: I0313 12:50:35.053421 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0"
Mar 13 12:50:35 crc kubenswrapper[4837]: E0313 12:50:35.053906 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8"
Mar 13 12:50:35 crc kubenswrapper[4837]: I0313 12:50:35.242150 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lkqv5/crc-debug-djqf7" event={"ID":"70913aeb-b1cd-4a84-b043-8b10d8a28196","Type":"ContainerStarted","Data":"bd2cc56ba5a1a7ecd3acb1a078af5a3a6894476f89e6f14c15c62f2a11f1660e"}
Mar 13 12:50:35 crc kubenswrapper[4837]: I0313 12:50:35.242823 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lkqv5/crc-debug-djqf7" event={"ID":"70913aeb-b1cd-4a84-b043-8b10d8a28196","Type":"ContainerStarted","Data":"a84cdf79e4cee6ea4dcee02331b86c1229aab71ff3081c6570c2ff689e35e097"}
Mar 13 12:50:35 crc kubenswrapper[4837]: I0313 12:50:35.257793 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lkqv5/crc-debug-djqf7" podStartSLOduration=1.257773597 podStartE2EDuration="1.257773597s" podCreationTimestamp="2026-03-13 12:50:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:50:35.254939378 +0000 UTC m=+3750.893206141" watchObservedRunningTime="2026-03-13 12:50:35.257773597 +0000 UTC m=+3750.896040360"
Mar 13 12:50:49 crc kubenswrapper[4837]: I0313 12:50:49.048678 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0"
Mar 13 12:50:49 crc kubenswrapper[4837]: E0313 12:50:49.049480 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8"
Mar 13 12:51:02 crc kubenswrapper[4837]: I0313 12:51:02.048206 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0"
Mar 13 12:51:02 crc kubenswrapper[4837]: E0313 12:51:02.049133 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8"
Mar 13 12:51:09 crc kubenswrapper[4837]: I0313 12:51:09.528795 4837 generic.go:334] "Generic (PLEG): container finished" podID="70913aeb-b1cd-4a84-b043-8b10d8a28196" containerID="bd2cc56ba5a1a7ecd3acb1a078af5a3a6894476f89e6f14c15c62f2a11f1660e" exitCode=0
Mar 13 12:51:09 crc kubenswrapper[4837]: I0313 12:51:09.528879 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lkqv5/crc-debug-djqf7" event={"ID":"70913aeb-b1cd-4a84-b043-8b10d8a28196","Type":"ContainerDied","Data":"bd2cc56ba5a1a7ecd3acb1a078af5a3a6894476f89e6f14c15c62f2a11f1660e"}
Mar 13 12:51:10 crc kubenswrapper[4837]: I0313 12:51:10.672838 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lkqv5/crc-debug-djqf7"
Mar 13 12:51:10 crc kubenswrapper[4837]: I0313 12:51:10.704535 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lkqv5/crc-debug-djqf7"]
Mar 13 12:51:10 crc kubenswrapper[4837]: I0313 12:51:10.716188 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lkqv5/crc-debug-djqf7"]
Mar 13 12:51:10 crc kubenswrapper[4837]: I0313 12:51:10.796588 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70913aeb-b1cd-4a84-b043-8b10d8a28196-host\") pod \"70913aeb-b1cd-4a84-b043-8b10d8a28196\" (UID: \"70913aeb-b1cd-4a84-b043-8b10d8a28196\") "
Mar 13 12:51:10 crc kubenswrapper[4837]: I0313 12:51:10.796784 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb9wk\" (UniqueName: \"kubernetes.io/projected/70913aeb-b1cd-4a84-b043-8b10d8a28196-kube-api-access-qb9wk\") pod \"70913aeb-b1cd-4a84-b043-8b10d8a28196\" (UID: \"70913aeb-b1cd-4a84-b043-8b10d8a28196\") "
Mar 13 12:51:10 crc kubenswrapper[4837]: I0313 12:51:10.797091 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/70913aeb-b1cd-4a84-b043-8b10d8a28196-host" (OuterVolumeSpecName: "host") pod "70913aeb-b1cd-4a84-b043-8b10d8a28196" (UID: "70913aeb-b1cd-4a84-b043-8b10d8a28196"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 12:51:10 crc kubenswrapper[4837]: I0313 12:51:10.797483 4837 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70913aeb-b1cd-4a84-b043-8b10d8a28196-host\") on node \"crc\" DevicePath \"\""
Mar 13 12:51:10 crc kubenswrapper[4837]: I0313 12:51:10.805902 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70913aeb-b1cd-4a84-b043-8b10d8a28196-kube-api-access-qb9wk" (OuterVolumeSpecName: "kube-api-access-qb9wk") pod "70913aeb-b1cd-4a84-b043-8b10d8a28196" (UID: "70913aeb-b1cd-4a84-b043-8b10d8a28196"). InnerVolumeSpecName "kube-api-access-qb9wk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:51:10 crc kubenswrapper[4837]: I0313 12:51:10.899382 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb9wk\" (UniqueName: \"kubernetes.io/projected/70913aeb-b1cd-4a84-b043-8b10d8a28196-kube-api-access-qb9wk\") on node \"crc\" DevicePath \"\""
Mar 13 12:51:11 crc kubenswrapper[4837]: I0313 12:51:11.067625 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70913aeb-b1cd-4a84-b043-8b10d8a28196" path="/var/lib/kubelet/pods/70913aeb-b1cd-4a84-b043-8b10d8a28196/volumes"
Mar 13 12:51:11 crc kubenswrapper[4837]: I0313 12:51:11.548249 4837 scope.go:117] "RemoveContainer" containerID="bd2cc56ba5a1a7ecd3acb1a078af5a3a6894476f89e6f14c15c62f2a11f1660e"
Mar 13 12:51:11 crc kubenswrapper[4837]: I0313 12:51:11.548288 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lkqv5/crc-debug-djqf7"
Mar 13 12:51:12 crc kubenswrapper[4837]: I0313 12:51:12.082264 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lkqv5/crc-debug-4ctbv"]
Mar 13 12:51:12 crc kubenswrapper[4837]: E0313 12:51:12.084066 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70913aeb-b1cd-4a84-b043-8b10d8a28196" containerName="container-00"
Mar 13 12:51:12 crc kubenswrapper[4837]: I0313 12:51:12.084174 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="70913aeb-b1cd-4a84-b043-8b10d8a28196" containerName="container-00"
Mar 13 12:51:12 crc kubenswrapper[4837]: I0313 12:51:12.084436 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="70913aeb-b1cd-4a84-b043-8b10d8a28196" containerName="container-00"
Mar 13 12:51:12 crc kubenswrapper[4837]: I0313 12:51:12.085189 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lkqv5/crc-debug-4ctbv"
Mar 13 12:51:12 crc kubenswrapper[4837]: I0313 12:51:12.088014 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-lkqv5"/"default-dockercfg-p2glw"
Mar 13 12:51:12 crc kubenswrapper[4837]: I0313 12:51:12.221562 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c98ae53-ab32-4810-8dc5-6989adb356d5-host\") pod \"crc-debug-4ctbv\" (UID: \"0c98ae53-ab32-4810-8dc5-6989adb356d5\") " pod="openshift-must-gather-lkqv5/crc-debug-4ctbv"
Mar 13 12:51:12 crc kubenswrapper[4837]: I0313 12:51:12.221765 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkw9c\" (UniqueName: \"kubernetes.io/projected/0c98ae53-ab32-4810-8dc5-6989adb356d5-kube-api-access-kkw9c\") pod \"crc-debug-4ctbv\" (UID: \"0c98ae53-ab32-4810-8dc5-6989adb356d5\") " pod="openshift-must-gather-lkqv5/crc-debug-4ctbv"
Mar 13 12:51:12 crc kubenswrapper[4837]: I0313 12:51:12.323686 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c98ae53-ab32-4810-8dc5-6989adb356d5-host\") pod \"crc-debug-4ctbv\" (UID: \"0c98ae53-ab32-4810-8dc5-6989adb356d5\") " pod="openshift-must-gather-lkqv5/crc-debug-4ctbv"
Mar 13 12:51:12 crc kubenswrapper[4837]: I0313 12:51:12.323813 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c98ae53-ab32-4810-8dc5-6989adb356d5-host\") pod \"crc-debug-4ctbv\" (UID: \"0c98ae53-ab32-4810-8dc5-6989adb356d5\") " pod="openshift-must-gather-lkqv5/crc-debug-4ctbv"
Mar 13 12:51:12 crc kubenswrapper[4837]: I0313 12:51:12.324196 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkw9c\" (UniqueName: \"kubernetes.io/projected/0c98ae53-ab32-4810-8dc5-6989adb356d5-kube-api-access-kkw9c\") pod \"crc-debug-4ctbv\" (UID: \"0c98ae53-ab32-4810-8dc5-6989adb356d5\") " pod="openshift-must-gather-lkqv5/crc-debug-4ctbv"
Mar 13 12:51:12 crc kubenswrapper[4837]: I0313 12:51:12.348862 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkw9c\" (UniqueName: \"kubernetes.io/projected/0c98ae53-ab32-4810-8dc5-6989adb356d5-kube-api-access-kkw9c\") pod \"crc-debug-4ctbv\" (UID: \"0c98ae53-ab32-4810-8dc5-6989adb356d5\") " pod="openshift-must-gather-lkqv5/crc-debug-4ctbv"
Mar 13 12:51:12 crc kubenswrapper[4837]: I0313 12:51:12.406343 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lkqv5/crc-debug-4ctbv"
Mar 13 12:51:12 crc kubenswrapper[4837]: I0313 12:51:12.562955 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lkqv5/crc-debug-4ctbv" event={"ID":"0c98ae53-ab32-4810-8dc5-6989adb356d5","Type":"ContainerStarted","Data":"60a594404f1ed7f723a9298dbd0a5df295668ef5d66ffda156410c502f7d3cf2"}
Mar 13 12:51:13 crc kubenswrapper[4837]: I0313 12:51:13.795771 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0"
Mar 13 12:51:13 crc kubenswrapper[4837]: E0313 12:51:13.796450 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8"
Mar 13 12:51:13 crc kubenswrapper[4837]: I0313 12:51:13.805431 4837 generic.go:334] "Generic (PLEG): container finished" podID="0c98ae53-ab32-4810-8dc5-6989adb356d5" containerID="37226ff0b2678a7a2f65c8a485d9c9dab1a4017df2a7227ff6688c23ec1e7cd8" exitCode=0
Mar 13 12:51:13 crc kubenswrapper[4837]: I0313 12:51:13.805487 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lkqv5/crc-debug-4ctbv" event={"ID":"0c98ae53-ab32-4810-8dc5-6989adb356d5","Type":"ContainerDied","Data":"37226ff0b2678a7a2f65c8a485d9c9dab1a4017df2a7227ff6688c23ec1e7cd8"}
Mar 13 12:51:14 crc kubenswrapper[4837]: I0313 12:51:14.242539 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lkqv5/crc-debug-4ctbv"]
Mar 13 12:51:14 crc kubenswrapper[4837]: I0313 12:51:14.273331 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lkqv5/crc-debug-4ctbv"]
Mar 13 12:51:14 crc kubenswrapper[4837]: I0313 12:51:14.915439 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lkqv5/crc-debug-4ctbv"
Mar 13 12:51:14 crc kubenswrapper[4837]: I0313 12:51:14.975521 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkw9c\" (UniqueName: \"kubernetes.io/projected/0c98ae53-ab32-4810-8dc5-6989adb356d5-kube-api-access-kkw9c\") pod \"0c98ae53-ab32-4810-8dc5-6989adb356d5\" (UID: \"0c98ae53-ab32-4810-8dc5-6989adb356d5\") "
Mar 13 12:51:14 crc kubenswrapper[4837]: I0313 12:51:14.975664 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c98ae53-ab32-4810-8dc5-6989adb356d5-host\") pod \"0c98ae53-ab32-4810-8dc5-6989adb356d5\" (UID: \"0c98ae53-ab32-4810-8dc5-6989adb356d5\") "
Mar 13 12:51:14 crc kubenswrapper[4837]: I0313 12:51:14.976226 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c98ae53-ab32-4810-8dc5-6989adb356d5-host" (OuterVolumeSpecName: "host") pod "0c98ae53-ab32-4810-8dc5-6989adb356d5" (UID: "0c98ae53-ab32-4810-8dc5-6989adb356d5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 12:51:14 crc kubenswrapper[4837]: I0313 12:51:14.981901 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c98ae53-ab32-4810-8dc5-6989adb356d5-kube-api-access-kkw9c" (OuterVolumeSpecName: "kube-api-access-kkw9c") pod "0c98ae53-ab32-4810-8dc5-6989adb356d5" (UID: "0c98ae53-ab32-4810-8dc5-6989adb356d5"). InnerVolumeSpecName "kube-api-access-kkw9c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.067393 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c98ae53-ab32-4810-8dc5-6989adb356d5" path="/var/lib/kubelet/pods/0c98ae53-ab32-4810-8dc5-6989adb356d5/volumes"
Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.078213 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkw9c\" (UniqueName: \"kubernetes.io/projected/0c98ae53-ab32-4810-8dc5-6989adb356d5-kube-api-access-kkw9c\") on node \"crc\" DevicePath \"\""
Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.078515 4837 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c98ae53-ab32-4810-8dc5-6989adb356d5-host\") on node \"crc\" DevicePath \"\""
Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.504055 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lkqv5/crc-debug-vdrpl"]
Mar 13 12:51:15 crc kubenswrapper[4837]: E0313 12:51:15.504554 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c98ae53-ab32-4810-8dc5-6989adb356d5" containerName="container-00"
Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.504577 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c98ae53-ab32-4810-8dc5-6989adb356d5" containerName="container-00"
Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.504921 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c98ae53-ab32-4810-8dc5-6989adb356d5" containerName="container-00"
Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.505696 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lkqv5/crc-debug-vdrpl"
Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.588871 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d11db9b4-2c6a-422b-9de4-ba64c73d8db8-host\") pod \"crc-debug-vdrpl\" (UID: \"d11db9b4-2c6a-422b-9de4-ba64c73d8db8\") " pod="openshift-must-gather-lkqv5/crc-debug-vdrpl"
Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.589019 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t5h7\" (UniqueName: \"kubernetes.io/projected/d11db9b4-2c6a-422b-9de4-ba64c73d8db8-kube-api-access-4t5h7\") pod \"crc-debug-vdrpl\" (UID: \"d11db9b4-2c6a-422b-9de4-ba64c73d8db8\") " pod="openshift-must-gather-lkqv5/crc-debug-vdrpl"
Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.690938 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t5h7\" (UniqueName: \"kubernetes.io/projected/d11db9b4-2c6a-422b-9de4-ba64c73d8db8-kube-api-access-4t5h7\") pod \"crc-debug-vdrpl\" (UID: \"d11db9b4-2c6a-422b-9de4-ba64c73d8db8\") " pod="openshift-must-gather-lkqv5/crc-debug-vdrpl"
Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.691033 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d11db9b4-2c6a-422b-9de4-ba64c73d8db8-host\") pod \"crc-debug-vdrpl\" (UID: \"d11db9b4-2c6a-422b-9de4-ba64c73d8db8\") " pod="openshift-must-gather-lkqv5/crc-debug-vdrpl"
Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.691137 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d11db9b4-2c6a-422b-9de4-ba64c73d8db8-host\") pod \"crc-debug-vdrpl\" (UID: \"d11db9b4-2c6a-422b-9de4-ba64c73d8db8\") " pod="openshift-must-gather-lkqv5/crc-debug-vdrpl"
Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.708387 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t5h7\" (UniqueName: \"kubernetes.io/projected/d11db9b4-2c6a-422b-9de4-ba64c73d8db8-kube-api-access-4t5h7\") pod \"crc-debug-vdrpl\" (UID: \"d11db9b4-2c6a-422b-9de4-ba64c73d8db8\") " pod="openshift-must-gather-lkqv5/crc-debug-vdrpl"
Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.824447 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lkqv5/crc-debug-vdrpl"
Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.825858 4837 scope.go:117] "RemoveContainer" containerID="37226ff0b2678a7a2f65c8a485d9c9dab1a4017df2a7227ff6688c23ec1e7cd8"
Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.825897 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lkqv5/crc-debug-4ctbv"
Mar 13 12:51:15 crc kubenswrapper[4837]: W0313 12:51:15.871721 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd11db9b4_2c6a_422b_9de4_ba64c73d8db8.slice/crio-8c5b501aee306ba1ef1efa0cce4a88cbfbd6bbed51084cd64ab93d5534477206 WatchSource:0}: Error finding container 8c5b501aee306ba1ef1efa0cce4a88cbfbd6bbed51084cd64ab93d5534477206: Status 404 returned error can't find the container with id 8c5b501aee306ba1ef1efa0cce4a88cbfbd6bbed51084cd64ab93d5534477206
Mar 13 12:51:16 crc kubenswrapper[4837]: I0313 12:51:16.835135 4837 generic.go:334] "Generic (PLEG): container finished" podID="d11db9b4-2c6a-422b-9de4-ba64c73d8db8" containerID="54fb761872f281d7c0b689685e33a78b40ef7b8a8fa21695c3ff8545600aa7ca" exitCode=0
Mar 13 12:51:16 crc kubenswrapper[4837]: I0313 12:51:16.835222 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lkqv5/crc-debug-vdrpl" event={"ID":"d11db9b4-2c6a-422b-9de4-ba64c73d8db8","Type":"ContainerDied","Data":"54fb761872f281d7c0b689685e33a78b40ef7b8a8fa21695c3ff8545600aa7ca"}
Mar 13 12:51:16 crc kubenswrapper[4837]: I0313 12:51:16.835461 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lkqv5/crc-debug-vdrpl" event={"ID":"d11db9b4-2c6a-422b-9de4-ba64c73d8db8","Type":"ContainerStarted","Data":"8c5b501aee306ba1ef1efa0cce4a88cbfbd6bbed51084cd64ab93d5534477206"}
Mar 13 12:51:16 crc kubenswrapper[4837]: I0313 12:51:16.867889 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lkqv5/crc-debug-vdrpl"]
Mar 13 12:51:16 crc kubenswrapper[4837]: I0313 12:51:16.876304 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lkqv5/crc-debug-vdrpl"]
Mar 13 12:51:17 crc kubenswrapper[4837]: I0313 12:51:17.986716 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lkqv5/crc-debug-vdrpl"
Mar 13 12:51:18 crc kubenswrapper[4837]: I0313 12:51:18.033921 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t5h7\" (UniqueName: \"kubernetes.io/projected/d11db9b4-2c6a-422b-9de4-ba64c73d8db8-kube-api-access-4t5h7\") pod \"d11db9b4-2c6a-422b-9de4-ba64c73d8db8\" (UID: \"d11db9b4-2c6a-422b-9de4-ba64c73d8db8\") "
Mar 13 12:51:18 crc kubenswrapper[4837]: I0313 12:51:18.034075 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d11db9b4-2c6a-422b-9de4-ba64c73d8db8-host\") pod \"d11db9b4-2c6a-422b-9de4-ba64c73d8db8\" (UID: \"d11db9b4-2c6a-422b-9de4-ba64c73d8db8\") "
Mar 13 12:51:18 crc kubenswrapper[4837]: I0313 12:51:18.034206 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d11db9b4-2c6a-422b-9de4-ba64c73d8db8-host" (OuterVolumeSpecName: "host") pod "d11db9b4-2c6a-422b-9de4-ba64c73d8db8" (UID: "d11db9b4-2c6a-422b-9de4-ba64c73d8db8"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 12:51:18 crc kubenswrapper[4837]: I0313 12:51:18.034623 4837 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d11db9b4-2c6a-422b-9de4-ba64c73d8db8-host\") on node \"crc\" DevicePath \"\""
Mar 13 12:51:18 crc kubenswrapper[4837]: I0313 12:51:18.040167 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11db9b4-2c6a-422b-9de4-ba64c73d8db8-kube-api-access-4t5h7" (OuterVolumeSpecName: "kube-api-access-4t5h7") pod "d11db9b4-2c6a-422b-9de4-ba64c73d8db8" (UID: "d11db9b4-2c6a-422b-9de4-ba64c73d8db8"). InnerVolumeSpecName "kube-api-access-4t5h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:51:18 crc kubenswrapper[4837]: I0313 12:51:18.136864 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t5h7\" (UniqueName: \"kubernetes.io/projected/d11db9b4-2c6a-422b-9de4-ba64c73d8db8-kube-api-access-4t5h7\") on node \"crc\" DevicePath \"\""
Mar 13 12:51:18 crc kubenswrapper[4837]: I0313 12:51:18.855976 4837 scope.go:117] "RemoveContainer" containerID="54fb761872f281d7c0b689685e33a78b40ef7b8a8fa21695c3ff8545600aa7ca"
Mar 13 12:51:18 crc kubenswrapper[4837]: I0313 12:51:18.856019 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lkqv5/crc-debug-vdrpl"
Mar 13 12:51:19 crc kubenswrapper[4837]: I0313 12:51:19.058228 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d11db9b4-2c6a-422b-9de4-ba64c73d8db8" path="/var/lib/kubelet/pods/d11db9b4-2c6a-422b-9de4-ba64c73d8db8/volumes"
Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.485935 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jn4zc"]
Mar 13 12:51:22 crc kubenswrapper[4837]: E0313 12:51:22.486360 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11db9b4-2c6a-422b-9de4-ba64c73d8db8" containerName="container-00"
Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.486372 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11db9b4-2c6a-422b-9de4-ba64c73d8db8" containerName="container-00"
Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.486573 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="d11db9b4-2c6a-422b-9de4-ba64c73d8db8" containerName="container-00"
Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.489896 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jn4zc"
Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.510021 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jn4zc"]
Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.627127 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n5vh\" (UniqueName: \"kubernetes.io/projected/e35b70ca-0247-45d0-aee2-1fb91eefa45c-kube-api-access-9n5vh\") pod \"certified-operators-jn4zc\" (UID: \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\") " pod="openshift-marketplace/certified-operators-jn4zc"
Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.627381 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e35b70ca-0247-45d0-aee2-1fb91eefa45c-utilities\") pod \"certified-operators-jn4zc\" (UID: \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\") " pod="openshift-marketplace/certified-operators-jn4zc"
Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.627434 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e35b70ca-0247-45d0-aee2-1fb91eefa45c-catalog-content\") pod \"certified-operators-jn4zc\" (UID: \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\") " pod="openshift-marketplace/certified-operators-jn4zc"
Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.729806 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e35b70ca-0247-45d0-aee2-1fb91eefa45c-utilities\") pod \"certified-operators-jn4zc\" (UID: \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\") " pod="openshift-marketplace/certified-operators-jn4zc"
Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.729864 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e35b70ca-0247-45d0-aee2-1fb91eefa45c-catalog-content\") pod \"certified-operators-jn4zc\" (UID: \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\") " pod="openshift-marketplace/certified-operators-jn4zc"
Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.730000 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n5vh\" (UniqueName: \"kubernetes.io/projected/e35b70ca-0247-45d0-aee2-1fb91eefa45c-kube-api-access-9n5vh\") pod \"certified-operators-jn4zc\" (UID: \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\") " pod="openshift-marketplace/certified-operators-jn4zc"
Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.730326 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e35b70ca-0247-45d0-aee2-1fb91eefa45c-utilities\") pod \"certified-operators-jn4zc\" (UID: \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\") " pod="openshift-marketplace/certified-operators-jn4zc"
Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.730469 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e35b70ca-0247-45d0-aee2-1fb91eefa45c-catalog-content\") pod \"certified-operators-jn4zc\" (UID: \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\") " pod="openshift-marketplace/certified-operators-jn4zc"
Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.767722 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n5vh\" (UniqueName: \"kubernetes.io/projected/e35b70ca-0247-45d0-aee2-1fb91eefa45c-kube-api-access-9n5vh\") pod \"certified-operators-jn4zc\" (UID: \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\") " pod="openshift-marketplace/certified-operators-jn4zc"
Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.817609 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jn4zc"
Mar 13 12:51:23 crc kubenswrapper[4837]: I0313 12:51:23.546498 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jn4zc"]
Mar 13 12:51:24 crc kubenswrapper[4837]: I0313 12:51:24.109236 4837 generic.go:334] "Generic (PLEG): container finished" podID="e35b70ca-0247-45d0-aee2-1fb91eefa45c" containerID="4cfa78b841c4bfd68fc339c7da7ba7a470ed2e7b9319732dc821f3f2f5974993" exitCode=0
Mar 13 12:51:24 crc kubenswrapper[4837]: I0313 12:51:24.109291 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn4zc" event={"ID":"e35b70ca-0247-45d0-aee2-1fb91eefa45c","Type":"ContainerDied","Data":"4cfa78b841c4bfd68fc339c7da7ba7a470ed2e7b9319732dc821f3f2f5974993"}
Mar 13 12:51:24 crc kubenswrapper[4837]: I0313 12:51:24.109325 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn4zc" event={"ID":"e35b70ca-0247-45d0-aee2-1fb91eefa45c","Type":"ContainerStarted","Data":"da8784cb3a7c82be8f8439b52b3069bcb72b04da847b72613804c85ea2bb6618"}
Mar 13 12:51:25 crc kubenswrapper[4837]: I0313 12:51:25.125405 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn4zc" event={"ID":"e35b70ca-0247-45d0-aee2-1fb91eefa45c","Type":"ContainerStarted","Data":"018ea4b7b5731d21bb3788eb633c0a2608f7d7437cbf7d79b0b4ecfffaa5bb54"}
Mar 13 12:51:27 crc kubenswrapper[4837]: I0313 12:51:27.049805 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0"
Mar 13 12:51:27 crc kubenswrapper[4837]: E0313 12:51:27.051378 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon
pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:51:27 crc kubenswrapper[4837]: I0313 12:51:27.152243 4837 generic.go:334] "Generic (PLEG): container finished" podID="e35b70ca-0247-45d0-aee2-1fb91eefa45c" containerID="018ea4b7b5731d21bb3788eb633c0a2608f7d7437cbf7d79b0b4ecfffaa5bb54" exitCode=0 Mar 13 12:51:27 crc kubenswrapper[4837]: I0313 12:51:27.152314 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn4zc" event={"ID":"e35b70ca-0247-45d0-aee2-1fb91eefa45c","Type":"ContainerDied","Data":"018ea4b7b5731d21bb3788eb633c0a2608f7d7437cbf7d79b0b4ecfffaa5bb54"} Mar 13 12:51:28 crc kubenswrapper[4837]: I0313 12:51:28.162890 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn4zc" event={"ID":"e35b70ca-0247-45d0-aee2-1fb91eefa45c","Type":"ContainerStarted","Data":"75019d955c224e1828886455a7893c37749e325e23de7f29d47424fb80702f73"} Mar 13 12:51:28 crc kubenswrapper[4837]: I0313 12:51:28.190509 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jn4zc" podStartSLOduration=2.756689093 podStartE2EDuration="6.190487637s" podCreationTimestamp="2026-03-13 12:51:22 +0000 UTC" firstStartedPulling="2026-03-13 12:51:24.11215402 +0000 UTC m=+3799.750420783" lastFinishedPulling="2026-03-13 12:51:27.545952564 +0000 UTC m=+3803.184219327" observedRunningTime="2026-03-13 12:51:28.181349612 +0000 UTC m=+3803.819616375" watchObservedRunningTime="2026-03-13 12:51:28.190487637 +0000 UTC m=+3803.828754400" Mar 13 12:51:32 crc kubenswrapper[4837]: I0313 12:51:32.818990 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jn4zc" Mar 13 12:51:32 crc kubenswrapper[4837]: I0313 
12:51:32.819658 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jn4zc" Mar 13 12:51:32 crc kubenswrapper[4837]: I0313 12:51:32.882274 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jn4zc" Mar 13 12:51:33 crc kubenswrapper[4837]: I0313 12:51:33.244294 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jn4zc" Mar 13 12:51:33 crc kubenswrapper[4837]: I0313 12:51:33.292574 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jn4zc"] Mar 13 12:51:35 crc kubenswrapper[4837]: I0313 12:51:35.216774 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jn4zc" podUID="e35b70ca-0247-45d0-aee2-1fb91eefa45c" containerName="registry-server" containerID="cri-o://75019d955c224e1828886455a7893c37749e325e23de7f29d47424fb80702f73" gracePeriod=2 Mar 13 12:51:35 crc kubenswrapper[4837]: I0313 12:51:35.685007 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jn4zc" Mar 13 12:51:35 crc kubenswrapper[4837]: I0313 12:51:35.824816 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e35b70ca-0247-45d0-aee2-1fb91eefa45c-catalog-content\") pod \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\" (UID: \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\") " Mar 13 12:51:35 crc kubenswrapper[4837]: I0313 12:51:35.824930 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n5vh\" (UniqueName: \"kubernetes.io/projected/e35b70ca-0247-45d0-aee2-1fb91eefa45c-kube-api-access-9n5vh\") pod \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\" (UID: \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\") " Mar 13 12:51:35 crc kubenswrapper[4837]: I0313 12:51:35.825066 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e35b70ca-0247-45d0-aee2-1fb91eefa45c-utilities\") pod \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\" (UID: \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\") " Mar 13 12:51:35 crc kubenswrapper[4837]: I0313 12:51:35.825822 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e35b70ca-0247-45d0-aee2-1fb91eefa45c-utilities" (OuterVolumeSpecName: "utilities") pod "e35b70ca-0247-45d0-aee2-1fb91eefa45c" (UID: "e35b70ca-0247-45d0-aee2-1fb91eefa45c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:51:35 crc kubenswrapper[4837]: I0313 12:51:35.833500 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e35b70ca-0247-45d0-aee2-1fb91eefa45c-kube-api-access-9n5vh" (OuterVolumeSpecName: "kube-api-access-9n5vh") pod "e35b70ca-0247-45d0-aee2-1fb91eefa45c" (UID: "e35b70ca-0247-45d0-aee2-1fb91eefa45c"). InnerVolumeSpecName "kube-api-access-9n5vh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:51:35 crc kubenswrapper[4837]: I0313 12:51:35.880947 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e35b70ca-0247-45d0-aee2-1fb91eefa45c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e35b70ca-0247-45d0-aee2-1fb91eefa45c" (UID: "e35b70ca-0247-45d0-aee2-1fb91eefa45c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:51:35 crc kubenswrapper[4837]: I0313 12:51:35.927481 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e35b70ca-0247-45d0-aee2-1fb91eefa45c-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:51:35 crc kubenswrapper[4837]: I0313 12:51:35.927825 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e35b70ca-0247-45d0-aee2-1fb91eefa45c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:51:35 crc kubenswrapper[4837]: I0313 12:51:35.927840 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n5vh\" (UniqueName: \"kubernetes.io/projected/e35b70ca-0247-45d0-aee2-1fb91eefa45c-kube-api-access-9n5vh\") on node \"crc\" DevicePath \"\"" Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.228522 4837 generic.go:334] "Generic (PLEG): container finished" podID="e35b70ca-0247-45d0-aee2-1fb91eefa45c" containerID="75019d955c224e1828886455a7893c37749e325e23de7f29d47424fb80702f73" exitCode=0 Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.228563 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn4zc" event={"ID":"e35b70ca-0247-45d0-aee2-1fb91eefa45c","Type":"ContainerDied","Data":"75019d955c224e1828886455a7893c37749e325e23de7f29d47424fb80702f73"} Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.228587 4837 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jn4zc" Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.228604 4837 scope.go:117] "RemoveContainer" containerID="75019d955c224e1828886455a7893c37749e325e23de7f29d47424fb80702f73" Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.228593 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn4zc" event={"ID":"e35b70ca-0247-45d0-aee2-1fb91eefa45c","Type":"ContainerDied","Data":"da8784cb3a7c82be8f8439b52b3069bcb72b04da847b72613804c85ea2bb6618"} Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.250014 4837 scope.go:117] "RemoveContainer" containerID="018ea4b7b5731d21bb3788eb633c0a2608f7d7437cbf7d79b0b4ecfffaa5bb54" Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.281698 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jn4zc"] Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.288698 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jn4zc"] Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.294868 4837 scope.go:117] "RemoveContainer" containerID="4cfa78b841c4bfd68fc339c7da7ba7a470ed2e7b9319732dc821f3f2f5974993" Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.318748 4837 scope.go:117] "RemoveContainer" containerID="75019d955c224e1828886455a7893c37749e325e23de7f29d47424fb80702f73" Mar 13 12:51:36 crc kubenswrapper[4837]: E0313 12:51:36.319315 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75019d955c224e1828886455a7893c37749e325e23de7f29d47424fb80702f73\": container with ID starting with 75019d955c224e1828886455a7893c37749e325e23de7f29d47424fb80702f73 not found: ID does not exist" containerID="75019d955c224e1828886455a7893c37749e325e23de7f29d47424fb80702f73" Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.319372 
4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75019d955c224e1828886455a7893c37749e325e23de7f29d47424fb80702f73"} err="failed to get container status \"75019d955c224e1828886455a7893c37749e325e23de7f29d47424fb80702f73\": rpc error: code = NotFound desc = could not find container \"75019d955c224e1828886455a7893c37749e325e23de7f29d47424fb80702f73\": container with ID starting with 75019d955c224e1828886455a7893c37749e325e23de7f29d47424fb80702f73 not found: ID does not exist" Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.319414 4837 scope.go:117] "RemoveContainer" containerID="018ea4b7b5731d21bb3788eb633c0a2608f7d7437cbf7d79b0b4ecfffaa5bb54" Mar 13 12:51:36 crc kubenswrapper[4837]: E0313 12:51:36.320533 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"018ea4b7b5731d21bb3788eb633c0a2608f7d7437cbf7d79b0b4ecfffaa5bb54\": container with ID starting with 018ea4b7b5731d21bb3788eb633c0a2608f7d7437cbf7d79b0b4ecfffaa5bb54 not found: ID does not exist" containerID="018ea4b7b5731d21bb3788eb633c0a2608f7d7437cbf7d79b0b4ecfffaa5bb54" Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.320564 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"018ea4b7b5731d21bb3788eb633c0a2608f7d7437cbf7d79b0b4ecfffaa5bb54"} err="failed to get container status \"018ea4b7b5731d21bb3788eb633c0a2608f7d7437cbf7d79b0b4ecfffaa5bb54\": rpc error: code = NotFound desc = could not find container \"018ea4b7b5731d21bb3788eb633c0a2608f7d7437cbf7d79b0b4ecfffaa5bb54\": container with ID starting with 018ea4b7b5731d21bb3788eb633c0a2608f7d7437cbf7d79b0b4ecfffaa5bb54 not found: ID does not exist" Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.320586 4837 scope.go:117] "RemoveContainer" containerID="4cfa78b841c4bfd68fc339c7da7ba7a470ed2e7b9319732dc821f3f2f5974993" Mar 13 12:51:36 crc kubenswrapper[4837]: E0313 
12:51:36.321929 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cfa78b841c4bfd68fc339c7da7ba7a470ed2e7b9319732dc821f3f2f5974993\": container with ID starting with 4cfa78b841c4bfd68fc339c7da7ba7a470ed2e7b9319732dc821f3f2f5974993 not found: ID does not exist" containerID="4cfa78b841c4bfd68fc339c7da7ba7a470ed2e7b9319732dc821f3f2f5974993" Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.321954 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cfa78b841c4bfd68fc339c7da7ba7a470ed2e7b9319732dc821f3f2f5974993"} err="failed to get container status \"4cfa78b841c4bfd68fc339c7da7ba7a470ed2e7b9319732dc821f3f2f5974993\": rpc error: code = NotFound desc = could not find container \"4cfa78b841c4bfd68fc339c7da7ba7a470ed2e7b9319732dc821f3f2f5974993\": container with ID starting with 4cfa78b841c4bfd68fc339c7da7ba7a470ed2e7b9319732dc821f3f2f5974993 not found: ID does not exist" Mar 13 12:51:37 crc kubenswrapper[4837]: I0313 12:51:37.062536 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e35b70ca-0247-45d0-aee2-1fb91eefa45c" path="/var/lib/kubelet/pods/e35b70ca-0247-45d0-aee2-1fb91eefa45c/volumes" Mar 13 12:51:41 crc kubenswrapper[4837]: I0313 12:51:41.051616 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:51:41 crc kubenswrapper[4837]: E0313 12:51:41.052166 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:51:46 crc kubenswrapper[4837]: I0313 12:51:46.172068 
4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6d84f6b8c8-8rrwq_74c7e377-b579-47bc-a992-cca0cf047627/barbican-api/0.log" Mar 13 12:51:46 crc kubenswrapper[4837]: I0313 12:51:46.298851 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6d84f6b8c8-8rrwq_74c7e377-b579-47bc-a992-cca0cf047627/barbican-api-log/0.log" Mar 13 12:51:46 crc kubenswrapper[4837]: I0313 12:51:46.373380 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-58c489697d-dgjtz_d1cfe08e-23bd-4f52-ab3c-3d68377de2a9/barbican-keystone-listener/0.log" Mar 13 12:51:46 crc kubenswrapper[4837]: I0313 12:51:46.400264 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-58c489697d-dgjtz_d1cfe08e-23bd-4f52-ab3c-3d68377de2a9/barbican-keystone-listener-log/0.log" Mar 13 12:51:46 crc kubenswrapper[4837]: I0313 12:51:46.565214 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6f4ff9ff9-mjmsz_55084c82-a823-4f31-926e-21702ba02ba1/barbican-worker/0.log" Mar 13 12:51:46 crc kubenswrapper[4837]: I0313 12:51:46.652249 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6f4ff9ff9-mjmsz_55084c82-a823-4f31-926e-21702ba02ba1/barbican-worker-log/0.log" Mar 13 12:51:46 crc kubenswrapper[4837]: I0313 12:51:46.785441 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj_2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:46 crc kubenswrapper[4837]: I0313 12:51:46.863502 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_82b5b509-a674-4a89-a7cc-c01c7bfca144/ceilometer-central-agent/0.log" Mar 13 12:51:46 crc kubenswrapper[4837]: I0313 12:51:46.921570 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_82b5b509-a674-4a89-a7cc-c01c7bfca144/ceilometer-notification-agent/0.log" Mar 13 12:51:46 crc kubenswrapper[4837]: I0313 12:51:46.976840 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_82b5b509-a674-4a89-a7cc-c01c7bfca144/proxy-httpd/0.log" Mar 13 12:51:47 crc kubenswrapper[4837]: I0313 12:51:47.047723 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_82b5b509-a674-4a89-a7cc-c01c7bfca144/sg-core/0.log" Mar 13 12:51:47 crc kubenswrapper[4837]: I0313 12:51:47.223064 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a8004928-50bc-4db8-a701-4458c42bc776/cinder-api-log/0.log" Mar 13 12:51:47 crc kubenswrapper[4837]: I0313 12:51:47.228126 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a8004928-50bc-4db8-a701-4458c42bc776/cinder-api/0.log" Mar 13 12:51:47 crc kubenswrapper[4837]: I0313 12:51:47.392737 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_580b8861-16eb-4142-bd61-6d0221a07f4d/cinder-scheduler/0.log" Mar 13 12:51:47 crc kubenswrapper[4837]: I0313 12:51:47.440353 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_580b8861-16eb-4142-bd61-6d0221a07f4d/probe/0.log" Mar 13 12:51:47 crc kubenswrapper[4837]: I0313 12:51:47.503375 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-s95mk_875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:47 crc kubenswrapper[4837]: I0313 12:51:47.618566 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp_0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:47 crc kubenswrapper[4837]: I0313 
12:51:47.726579 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-bxc2t_98f4bdc5-6452-4630-a299-6234d8a63bf8/init/0.log" Mar 13 12:51:47 crc kubenswrapper[4837]: I0313 12:51:47.854766 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-bxc2t_98f4bdc5-6452-4630-a299-6234d8a63bf8/init/0.log" Mar 13 12:51:47 crc kubenswrapper[4837]: I0313 12:51:47.912096 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts_121f6d1b-1277-4d68-8a48-6c4630dd6fe5/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:47 crc kubenswrapper[4837]: I0313 12:51:47.978928 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-bxc2t_98f4bdc5-6452-4630-a299-6234d8a63bf8/dnsmasq-dns/0.log" Mar 13 12:51:48 crc kubenswrapper[4837]: I0313 12:51:48.109855 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d3f87d89-35d5-4dc0-9c37-5297718a9351/glance-log/0.log" Mar 13 12:51:48 crc kubenswrapper[4837]: I0313 12:51:48.146855 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d3f87d89-35d5-4dc0-9c37-5297718a9351/glance-httpd/0.log" Mar 13 12:51:48 crc kubenswrapper[4837]: I0313 12:51:48.263328 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d0f3b003-127f-414f-877a-8f7df2872049/glance-httpd/0.log" Mar 13 12:51:48 crc kubenswrapper[4837]: I0313 12:51:48.337054 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d0f3b003-127f-414f-877a-8f7df2872049/glance-log/0.log" Mar 13 12:51:48 crc kubenswrapper[4837]: I0313 12:51:48.490825 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-fd6ddfd9b-f66l8_4d3df345-07a2-41bf-aae4-088b3ce83b63/horizon/0.log" 
Mar 13 12:51:48 crc kubenswrapper[4837]: I0313 12:51:48.668814 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-gj59c_6cc8d0dd-d1e6-4374-bb90-aaefc9197350/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:48 crc kubenswrapper[4837]: I0313 12:51:48.759136 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-2b48q_033a02c2-cbe4-4676-ae46-f9b9b17a60fb/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:48 crc kubenswrapper[4837]: I0313 12:51:48.822227 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-fd6ddfd9b-f66l8_4d3df345-07a2-41bf-aae4-088b3ce83b63/horizon-log/0.log" Mar 13 12:51:49 crc kubenswrapper[4837]: I0313 12:51:49.089075 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_abd69ff2-e72e-40c0-925f-d0c1c0a40f9a/kube-state-metrics/0.log" Mar 13 12:51:49 crc kubenswrapper[4837]: I0313 12:51:49.117843 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-55dc4d44f8-mvjvg_9cb9614d-a433-4be3-8145-4c1c8593404f/keystone-api/0.log" Mar 13 12:51:49 crc kubenswrapper[4837]: I0313 12:51:49.243608 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5_394104d4-0291-4071-a7da-d7b71e0f4083/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:49 crc kubenswrapper[4837]: I0313 12:51:49.586754 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-667d547b9-4p8qm_3c00dfc0-061b-43ba-b529-a89c9157a0cf/neutron-api/0.log" Mar 13 12:51:49 crc kubenswrapper[4837]: I0313 12:51:49.931147 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4_20f35066-9c10-4433-a655-f5cef18d4deb/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:50 crc kubenswrapper[4837]: I0313 12:51:50.014881 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-667d547b9-4p8qm_3c00dfc0-061b-43ba-b529-a89c9157a0cf/neutron-httpd/0.log" Mar 13 12:51:50 crc kubenswrapper[4837]: I0313 12:51:50.591936 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4e6cd1d9-f670-4e94-8322-44e471c3be71/nova-api-log/0.log" Mar 13 12:51:50 crc kubenswrapper[4837]: I0313 12:51:50.722045 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_58240a84-c8ab-43a9-8113-eaf2d0ddea2e/nova-cell0-conductor-conductor/0.log" Mar 13 12:51:50 crc kubenswrapper[4837]: I0313 12:51:50.954062 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_9a51debb-c1cb-4a55-b845-e89d89d11e86/nova-cell1-conductor-conductor/0.log" Mar 13 12:51:51 crc kubenswrapper[4837]: I0313 12:51:51.028993 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4e6cd1d9-f670-4e94-8322-44e471c3be71/nova-api-api/0.log" Mar 13 12:51:51 crc kubenswrapper[4837]: I0313 12:51:51.030945 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_662e258d-fe94-4373-912d-c906f1e93c90/nova-cell1-novncproxy-novncproxy/0.log" Mar 13 12:51:51 crc kubenswrapper[4837]: I0313 12:51:51.233005 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-4jdmk_e6986f16-e143-49f4-81e5-58abba717876/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:51 crc kubenswrapper[4837]: I0313 12:51:51.813078 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_7faa5418-aa48-4e20-830c-bb171cfea0d9/nova-metadata-log/0.log" Mar 13 12:51:52 crc kubenswrapper[4837]: I0313 12:51:52.048664 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_362e31d4-ea62-40ed-8426-982d47559472/mysql-bootstrap/0.log" Mar 13 12:51:52 crc kubenswrapper[4837]: I0313 12:51:52.257937 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d380e047-7297-4835-b948-6c86c6b6aa27/nova-scheduler-scheduler/0.log" Mar 13 12:51:52 crc kubenswrapper[4837]: I0313 12:51:52.262410 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_362e31d4-ea62-40ed-8426-982d47559472/mysql-bootstrap/0.log" Mar 13 12:51:52 crc kubenswrapper[4837]: I0313 12:51:52.322434 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_362e31d4-ea62-40ed-8426-982d47559472/galera/0.log" Mar 13 12:51:52 crc kubenswrapper[4837]: I0313 12:51:52.462277 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd/mysql-bootstrap/0.log" Mar 13 12:51:52 crc kubenswrapper[4837]: I0313 12:51:52.715227 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd/mysql-bootstrap/0.log" Mar 13 12:51:52 crc kubenswrapper[4837]: I0313 12:51:52.742371 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd/galera/0.log" Mar 13 12:51:52 crc kubenswrapper[4837]: I0313 12:51:52.878140 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_5d15c820-a2ee-4d4c-986f-2c2f09b43f79/openstackclient/0.log" Mar 13 12:51:52 crc kubenswrapper[4837]: I0313 12:51:52.946265 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-w69p6_18eb496a-7d9f-4bf6-af71-3b7b585d0f7d/openstack-network-exporter/0.log" Mar 13 12:51:53 crc kubenswrapper[4837]: I0313 12:51:53.071048 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7faa5418-aa48-4e20-830c-bb171cfea0d9/nova-metadata-metadata/0.log" Mar 13 12:51:53 crc kubenswrapper[4837]: I0313 12:51:53.191871 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-nbhpw_32dc51d9-5638-4530-91c8-5be8c13e60f3/ovn-controller/0.log" Mar 13 12:51:53 crc kubenswrapper[4837]: I0313 12:51:53.335694 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ls998_71e00962-6b2f-495c-8f34-52993f66cef9/ovsdb-server-init/0.log" Mar 13 12:51:53 crc kubenswrapper[4837]: I0313 12:51:53.465058 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ls998_71e00962-6b2f-495c-8f34-52993f66cef9/ovsdb-server/0.log" Mar 13 12:51:53 crc kubenswrapper[4837]: I0313 12:51:53.475464 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ls998_71e00962-6b2f-495c-8f34-52993f66cef9/ovsdb-server-init/0.log" Mar 13 12:51:53 crc kubenswrapper[4837]: I0313 12:51:53.520120 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ls998_71e00962-6b2f-495c-8f34-52993f66cef9/ovs-vswitchd/0.log" Mar 13 12:51:53 crc kubenswrapper[4837]: I0313 12:51:53.703849 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-kbffp_092bd277-504a-450d-aca1-d8ecc18f0c9f/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:53 crc kubenswrapper[4837]: I0313 12:51:53.766443 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_25ea0f5e-e277-4944-8c9d-2c7709e1a8cf/openstack-network-exporter/0.log" Mar 13 12:51:53 crc kubenswrapper[4837]: I0313 
12:51:53.873849 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_25ea0f5e-e277-4944-8c9d-2c7709e1a8cf/ovn-northd/0.log" Mar 13 12:51:53 crc kubenswrapper[4837]: I0313 12:51:53.951917 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_38d61ffe-3c44-4657-bc91-d849f766a3e1/openstack-network-exporter/0.log" Mar 13 12:51:54 crc kubenswrapper[4837]: I0313 12:51:54.051914 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_38d61ffe-3c44-4657-bc91-d849f766a3e1/ovsdbserver-nb/0.log" Mar 13 12:51:54 crc kubenswrapper[4837]: I0313 12:51:54.189411 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3d10fcb0-4d45-45bf-a663-971b8ce74010/openstack-network-exporter/0.log" Mar 13 12:51:54 crc kubenswrapper[4837]: I0313 12:51:54.206533 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3d10fcb0-4d45-45bf-a663-971b8ce74010/ovsdbserver-sb/0.log" Mar 13 12:51:54 crc kubenswrapper[4837]: I0313 12:51:54.456118 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-59f7b5dc8d-rnsz6_07eece9e-0e59-4a06-8fea-efb4217d6907/placement-api/0.log" Mar 13 12:51:54 crc kubenswrapper[4837]: I0313 12:51:54.474925 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-59f7b5dc8d-rnsz6_07eece9e-0e59-4a06-8fea-efb4217d6907/placement-log/0.log" Mar 13 12:51:54 crc kubenswrapper[4837]: I0313 12:51:54.516369 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_90028d66-5134-4c09-af15-71e754f49bf3/setup-container/0.log" Mar 13 12:51:54 crc kubenswrapper[4837]: I0313 12:51:54.735267 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_90028d66-5134-4c09-af15-71e754f49bf3/setup-container/0.log" Mar 13 12:51:54 crc kubenswrapper[4837]: I0313 12:51:54.843009 4837 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_90028d66-5134-4c09-af15-71e754f49bf3/rabbitmq/0.log" Mar 13 12:51:54 crc kubenswrapper[4837]: I0313 12:51:54.851885 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_245e5a26-d143-4e4d-bae8-094275a91574/setup-container/0.log" Mar 13 12:51:55 crc kubenswrapper[4837]: I0313 12:51:55.015246 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_245e5a26-d143-4e4d-bae8-094275a91574/setup-container/0.log" Mar 13 12:51:55 crc kubenswrapper[4837]: I0313 12:51:55.051398 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_245e5a26-d143-4e4d-bae8-094275a91574/rabbitmq/0.log" Mar 13 12:51:55 crc kubenswrapper[4837]: I0313 12:51:55.062569 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:51:55 crc kubenswrapper[4837]: E0313 12:51:55.062950 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:51:55 crc kubenswrapper[4837]: I0313 12:51:55.100232 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d_3b96ea7e-2148-4659-9a26-3335c88888c1/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:55 crc kubenswrapper[4837]: I0313 12:51:55.310305 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-pz9nt_0b7402b1-0b76-4ffa-b37f-6e014183f6a6/redhat-edpm-deployment-openstack-edpm-ipam/0.log" 
Mar 13 12:51:55 crc kubenswrapper[4837]: I0313 12:51:55.342305 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6_bfedd3e5-e8d7-4311-9a0d-30276ce40418/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:55 crc kubenswrapper[4837]: I0313 12:51:55.608563 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-s6jdp_f12ac62a-2011-4e89-a16f-e136959f9d1a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:55 crc kubenswrapper[4837]: I0313 12:51:55.611840 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-vjnpx_4ddcb794-ab03-4308-a93c-c5929ed96e01/ssh-known-hosts-edpm-deployment/0.log" Mar 13 12:51:55 crc kubenswrapper[4837]: I0313 12:51:55.852619 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-bfbc874dc-vsh7q_36ffa543-526d-4d56-b599-06fcfe0988cf/proxy-server/0.log" Mar 13 12:51:55 crc kubenswrapper[4837]: I0313 12:51:55.952953 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-bfbc874dc-vsh7q_36ffa543-526d-4d56-b599-06fcfe0988cf/proxy-httpd/0.log" Mar 13 12:51:56 crc kubenswrapper[4837]: I0313 12:51:56.055602 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-69xgx_24998567-afa6-4adc-a503-4fc054946aef/swift-ring-rebalance/0.log" Mar 13 12:51:56 crc kubenswrapper[4837]: I0313 12:51:56.329153 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/account-auditor/0.log" Mar 13 12:51:56 crc kubenswrapper[4837]: I0313 12:51:56.433249 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/account-reaper/0.log" Mar 13 12:51:56 crc kubenswrapper[4837]: I0313 12:51:56.463865 4837 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/account-replicator/0.log" Mar 13 12:51:56 crc kubenswrapper[4837]: I0313 12:51:56.478912 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/account-server/0.log" Mar 13 12:51:56 crc kubenswrapper[4837]: I0313 12:51:56.552860 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/container-auditor/0.log" Mar 13 12:51:56 crc kubenswrapper[4837]: I0313 12:51:56.672760 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/container-replicator/0.log" Mar 13 12:51:56 crc kubenswrapper[4837]: I0313 12:51:56.735422 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/container-server/0.log" Mar 13 12:51:56 crc kubenswrapper[4837]: I0313 12:51:56.754499 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/container-updater/0.log" Mar 13 12:51:56 crc kubenswrapper[4837]: I0313 12:51:56.819607 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/object-auditor/0.log" Mar 13 12:51:56 crc kubenswrapper[4837]: I0313 12:51:56.869515 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/object-expirer/0.log" Mar 13 12:51:56 crc kubenswrapper[4837]: I0313 12:51:56.974620 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/object-replicator/0.log" Mar 13 12:51:57 crc kubenswrapper[4837]: I0313 12:51:56.998241 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/object-server/0.log" Mar 13 12:51:57 crc kubenswrapper[4837]: I0313 12:51:57.064304 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/object-updater/0.log" Mar 13 12:51:57 crc kubenswrapper[4837]: I0313 12:51:57.129817 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/rsync/0.log" Mar 13 12:51:57 crc kubenswrapper[4837]: I0313 12:51:57.189083 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/swift-recon-cron/0.log" Mar 13 12:51:57 crc kubenswrapper[4837]: I0313 12:51:57.351050 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x_ac15848f-4f6f-4159-828f-d30a77f93a4b/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:57 crc kubenswrapper[4837]: I0313 12:51:57.461124 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_66bdda91-c5b6-4879-9adf-21846884c797/tempest-tests-tempest-tests-runner/0.log" Mar 13 12:51:57 crc kubenswrapper[4837]: I0313 12:51:57.538024 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_0244acef-b630-4b97-9bb5-9f99de391613/test-operator-logs-container/0.log" Mar 13 12:51:57 crc kubenswrapper[4837]: I0313 12:51:57.709723 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-42br8_e3ec33da-9091-4eb1-aafa-62b9bdf16072/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.160498 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556772-5k6fp"] Mar 13 12:52:00 crc 
kubenswrapper[4837]: E0313 12:52:00.161148 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e35b70ca-0247-45d0-aee2-1fb91eefa45c" containerName="extract-content" Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.161161 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35b70ca-0247-45d0-aee2-1fb91eefa45c" containerName="extract-content" Mar 13 12:52:00 crc kubenswrapper[4837]: E0313 12:52:00.161185 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e35b70ca-0247-45d0-aee2-1fb91eefa45c" containerName="extract-utilities" Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.161191 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35b70ca-0247-45d0-aee2-1fb91eefa45c" containerName="extract-utilities" Mar 13 12:52:00 crc kubenswrapper[4837]: E0313 12:52:00.161201 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e35b70ca-0247-45d0-aee2-1fb91eefa45c" containerName="registry-server" Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.161207 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35b70ca-0247-45d0-aee2-1fb91eefa45c" containerName="registry-server" Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.161407 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e35b70ca-0247-45d0-aee2-1fb91eefa45c" containerName="registry-server" Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.162035 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556772-5k6fp" Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.165740 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.165818 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.165984 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.170674 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556772-5k6fp"] Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.248136 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vcsv\" (UniqueName: \"kubernetes.io/projected/1ba6a258-c015-4c82-b7d0-736ea4ddf3a0-kube-api-access-6vcsv\") pod \"auto-csr-approver-29556772-5k6fp\" (UID: \"1ba6a258-c015-4c82-b7d0-736ea4ddf3a0\") " pod="openshift-infra/auto-csr-approver-29556772-5k6fp" Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.349594 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vcsv\" (UniqueName: \"kubernetes.io/projected/1ba6a258-c015-4c82-b7d0-736ea4ddf3a0-kube-api-access-6vcsv\") pod \"auto-csr-approver-29556772-5k6fp\" (UID: \"1ba6a258-c015-4c82-b7d0-736ea4ddf3a0\") " pod="openshift-infra/auto-csr-approver-29556772-5k6fp" Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.382340 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vcsv\" (UniqueName: \"kubernetes.io/projected/1ba6a258-c015-4c82-b7d0-736ea4ddf3a0-kube-api-access-6vcsv\") pod \"auto-csr-approver-29556772-5k6fp\" (UID: \"1ba6a258-c015-4c82-b7d0-736ea4ddf3a0\") " 
pod="openshift-infra/auto-csr-approver-29556772-5k6fp" Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.490035 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556772-5k6fp" Mar 13 12:52:01 crc kubenswrapper[4837]: I0313 12:52:01.013464 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556772-5k6fp"] Mar 13 12:52:01 crc kubenswrapper[4837]: I0313 12:52:01.457155 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556772-5k6fp" event={"ID":"1ba6a258-c015-4c82-b7d0-736ea4ddf3a0","Type":"ContainerStarted","Data":"b708ecc8449a33bcb49fe655201d03102cefaaa202afe9f71e72e16452298a0f"} Mar 13 12:52:02 crc kubenswrapper[4837]: I0313 12:52:02.466572 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556772-5k6fp" event={"ID":"1ba6a258-c015-4c82-b7d0-736ea4ddf3a0","Type":"ContainerStarted","Data":"202f4741378dc74444f14dd2386ad8db9f6a085bd9e8216a4ebc85b491ab3c81"} Mar 13 12:52:02 crc kubenswrapper[4837]: I0313 12:52:02.486186 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556772-5k6fp" podStartSLOduration=1.391804637 podStartE2EDuration="2.486169278s" podCreationTimestamp="2026-03-13 12:52:00 +0000 UTC" firstStartedPulling="2026-03-13 12:52:01.001591513 +0000 UTC m=+3836.639858296" lastFinishedPulling="2026-03-13 12:52:02.095956174 +0000 UTC m=+3837.734222937" observedRunningTime="2026-03-13 12:52:02.482572665 +0000 UTC m=+3838.120839438" watchObservedRunningTime="2026-03-13 12:52:02.486169278 +0000 UTC m=+3838.124436041" Mar 13 12:52:03 crc kubenswrapper[4837]: I0313 12:52:03.480188 4837 generic.go:334] "Generic (PLEG): container finished" podID="1ba6a258-c015-4c82-b7d0-736ea4ddf3a0" containerID="202f4741378dc74444f14dd2386ad8db9f6a085bd9e8216a4ebc85b491ab3c81" exitCode=0 Mar 13 12:52:03 crc 
kubenswrapper[4837]: I0313 12:52:03.480259 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556772-5k6fp" event={"ID":"1ba6a258-c015-4c82-b7d0-736ea4ddf3a0","Type":"ContainerDied","Data":"202f4741378dc74444f14dd2386ad8db9f6a085bd9e8216a4ebc85b491ab3c81"} Mar 13 12:52:04 crc kubenswrapper[4837]: I0313 12:52:04.867933 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556772-5k6fp" Mar 13 12:52:05 crc kubenswrapper[4837]: I0313 12:52:05.035176 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vcsv\" (UniqueName: \"kubernetes.io/projected/1ba6a258-c015-4c82-b7d0-736ea4ddf3a0-kube-api-access-6vcsv\") pod \"1ba6a258-c015-4c82-b7d0-736ea4ddf3a0\" (UID: \"1ba6a258-c015-4c82-b7d0-736ea4ddf3a0\") " Mar 13 12:52:05 crc kubenswrapper[4837]: I0313 12:52:05.049934 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ba6a258-c015-4c82-b7d0-736ea4ddf3a0-kube-api-access-6vcsv" (OuterVolumeSpecName: "kube-api-access-6vcsv") pod "1ba6a258-c015-4c82-b7d0-736ea4ddf3a0" (UID: "1ba6a258-c015-4c82-b7d0-736ea4ddf3a0"). InnerVolumeSpecName "kube-api-access-6vcsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:52:05 crc kubenswrapper[4837]: I0313 12:52:05.137702 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vcsv\" (UniqueName: \"kubernetes.io/projected/1ba6a258-c015-4c82-b7d0-736ea4ddf3a0-kube-api-access-6vcsv\") on node \"crc\" DevicePath \"\"" Mar 13 12:52:05 crc kubenswrapper[4837]: I0313 12:52:05.500949 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556772-5k6fp" event={"ID":"1ba6a258-c015-4c82-b7d0-736ea4ddf3a0","Type":"ContainerDied","Data":"b708ecc8449a33bcb49fe655201d03102cefaaa202afe9f71e72e16452298a0f"} Mar 13 12:52:05 crc kubenswrapper[4837]: I0313 12:52:05.501325 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b708ecc8449a33bcb49fe655201d03102cefaaa202afe9f71e72e16452298a0f" Mar 13 12:52:05 crc kubenswrapper[4837]: I0313 12:52:05.501407 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556772-5k6fp" Mar 13 12:52:05 crc kubenswrapper[4837]: I0313 12:52:05.572262 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556766-lgkks"] Mar 13 12:52:05 crc kubenswrapper[4837]: I0313 12:52:05.581894 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556766-lgkks"] Mar 13 12:52:07 crc kubenswrapper[4837]: I0313 12:52:07.084103 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="528b7541-8fac-4df9-9168-c3166532618d" path="/var/lib/kubelet/pods/528b7541-8fac-4df9-9168-c3166532618d/volumes" Mar 13 12:52:08 crc kubenswrapper[4837]: E0313 12:52:08.550165 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ba6a258_c015_4c82_b7d0_736ea4ddf3a0.slice\": RecentStats: unable to find data in 
memory cache]" Mar 13 12:52:08 crc kubenswrapper[4837]: I0313 12:52:08.650723 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_ae39431b-5fa4-4a09-b76f-44b4d256c129/memcached/0.log" Mar 13 12:52:09 crc kubenswrapper[4837]: I0313 12:52:09.048650 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:52:09 crc kubenswrapper[4837]: E0313 12:52:09.048949 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:52:14 crc kubenswrapper[4837]: I0313 12:52:14.924336 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dklw6"] Mar 13 12:52:14 crc kubenswrapper[4837]: E0313 12:52:14.926264 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ba6a258-c015-4c82-b7d0-736ea4ddf3a0" containerName="oc" Mar 13 12:52:14 crc kubenswrapper[4837]: I0313 12:52:14.926407 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba6a258-c015-4c82-b7d0-736ea4ddf3a0" containerName="oc" Mar 13 12:52:14 crc kubenswrapper[4837]: I0313 12:52:14.926688 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ba6a258-c015-4c82-b7d0-736ea4ddf3a0" containerName="oc" Mar 13 12:52:14 crc kubenswrapper[4837]: I0313 12:52:14.928261 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:14 crc kubenswrapper[4837]: I0313 12:52:14.939421 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dklw6"] Mar 13 12:52:15 crc kubenswrapper[4837]: I0313 12:52:15.030246 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-catalog-content\") pod \"redhat-operators-dklw6\" (UID: \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\") " pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:15 crc kubenswrapper[4837]: I0313 12:52:15.030312 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-utilities\") pod \"redhat-operators-dklw6\" (UID: \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\") " pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:15 crc kubenswrapper[4837]: I0313 12:52:15.030627 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ts7w\" (UniqueName: \"kubernetes.io/projected/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-kube-api-access-5ts7w\") pod \"redhat-operators-dklw6\" (UID: \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\") " pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:15 crc kubenswrapper[4837]: I0313 12:52:15.132926 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ts7w\" (UniqueName: \"kubernetes.io/projected/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-kube-api-access-5ts7w\") pod \"redhat-operators-dklw6\" (UID: \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\") " pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:15 crc kubenswrapper[4837]: I0313 12:52:15.133231 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-catalog-content\") pod \"redhat-operators-dklw6\" (UID: \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\") " pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:15 crc kubenswrapper[4837]: I0313 12:52:15.133391 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-utilities\") pod \"redhat-operators-dklw6\" (UID: \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\") " pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:15 crc kubenswrapper[4837]: I0313 12:52:15.133945 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-catalog-content\") pod \"redhat-operators-dklw6\" (UID: \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\") " pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:15 crc kubenswrapper[4837]: I0313 12:52:15.133996 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-utilities\") pod \"redhat-operators-dklw6\" (UID: \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\") " pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:15 crc kubenswrapper[4837]: I0313 12:52:15.154928 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ts7w\" (UniqueName: \"kubernetes.io/projected/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-kube-api-access-5ts7w\") pod \"redhat-operators-dklw6\" (UID: \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\") " pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:15 crc kubenswrapper[4837]: I0313 12:52:15.258070 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:15 crc kubenswrapper[4837]: I0313 12:52:15.757817 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dklw6"] Mar 13 12:52:16 crc kubenswrapper[4837]: I0313 12:52:16.604274 4837 generic.go:334] "Generic (PLEG): container finished" podID="f6c415fc-c9b8-4e4f-9116-024c9bee94b0" containerID="466256fcf8dddb5197d5eca2dd860959fb51e832e51a92f91899b05aa8236f7b" exitCode=0 Mar 13 12:52:16 crc kubenswrapper[4837]: I0313 12:52:16.604345 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dklw6" event={"ID":"f6c415fc-c9b8-4e4f-9116-024c9bee94b0","Type":"ContainerDied","Data":"466256fcf8dddb5197d5eca2dd860959fb51e832e51a92f91899b05aa8236f7b"} Mar 13 12:52:16 crc kubenswrapper[4837]: I0313 12:52:16.604603 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dklw6" event={"ID":"f6c415fc-c9b8-4e4f-9116-024c9bee94b0","Type":"ContainerStarted","Data":"a5c56fb8f59524a48c89f88654723d62ee24c6c21452fed4f86ed021cc052e22"} Mar 13 12:52:17 crc kubenswrapper[4837]: I0313 12:52:17.615385 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dklw6" event={"ID":"f6c415fc-c9b8-4e4f-9116-024c9bee94b0","Type":"ContainerStarted","Data":"105c28a231210141f36873f32a835117fd67229caf2bcd61214068a7268ee516"} Mar 13 12:52:18 crc kubenswrapper[4837]: E0313 12:52:18.779629 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ba6a258_c015_4c82_b7d0_736ea4ddf3a0.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:52:20 crc kubenswrapper[4837]: I0313 12:52:20.051350 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:52:20 crc 
kubenswrapper[4837]: E0313 12:52:20.052349 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:52:22 crc kubenswrapper[4837]: I0313 12:52:22.656974 4837 generic.go:334] "Generic (PLEG): container finished" podID="f6c415fc-c9b8-4e4f-9116-024c9bee94b0" containerID="105c28a231210141f36873f32a835117fd67229caf2bcd61214068a7268ee516" exitCode=0 Mar 13 12:52:22 crc kubenswrapper[4837]: I0313 12:52:22.657055 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dklw6" event={"ID":"f6c415fc-c9b8-4e4f-9116-024c9bee94b0","Type":"ContainerDied","Data":"105c28a231210141f36873f32a835117fd67229caf2bcd61214068a7268ee516"} Mar 13 12:52:24 crc kubenswrapper[4837]: I0313 12:52:24.675653 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dklw6" event={"ID":"f6c415fc-c9b8-4e4f-9116-024c9bee94b0","Type":"ContainerStarted","Data":"2d7bff1e3c3855fc8cb2aa32ef5753d57469af94cb0b8d6de0d346b01f07fc87"} Mar 13 12:52:24 crc kubenswrapper[4837]: I0313 12:52:24.700475 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dklw6" podStartSLOduration=3.44520617 podStartE2EDuration="10.700452771s" podCreationTimestamp="2026-03-13 12:52:14 +0000 UTC" firstStartedPulling="2026-03-13 12:52:16.607686116 +0000 UTC m=+3852.245952879" lastFinishedPulling="2026-03-13 12:52:23.862932717 +0000 UTC m=+3859.501199480" observedRunningTime="2026-03-13 12:52:24.692306275 +0000 UTC m=+3860.330573038" watchObservedRunningTime="2026-03-13 12:52:24.700452771 +0000 UTC m=+3860.338719544" 
Mar 13 12:52:25 crc kubenswrapper[4837]: I0313 12:52:25.258159 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:25 crc kubenswrapper[4837]: I0313 12:52:25.258209 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:25 crc kubenswrapper[4837]: I0313 12:52:25.658155 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-b7cdx_e645f00a-8463-4fac-b010-f0500b54d68a/manager/0.log" Mar 13 12:52:25 crc kubenswrapper[4837]: I0313 12:52:25.902833 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/util/0.log" Mar 13 12:52:26 crc kubenswrapper[4837]: I0313 12:52:26.124207 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/pull/0.log" Mar 13 12:52:26 crc kubenswrapper[4837]: I0313 12:52:26.181032 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/util/0.log" Mar 13 12:52:26 crc kubenswrapper[4837]: I0313 12:52:26.303873 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dklw6" podUID="f6c415fc-c9b8-4e4f-9116-024c9bee94b0" containerName="registry-server" probeResult="failure" output=< Mar 13 12:52:26 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Mar 13 12:52:26 crc kubenswrapper[4837]: > Mar 13 12:52:26 crc kubenswrapper[4837]: I0313 12:52:26.426693 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/pull/0.log"
Mar 13 12:52:26 crc kubenswrapper[4837]: I0313 12:52:26.571838 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/util/0.log"
Mar 13 12:52:26 crc kubenswrapper[4837]: I0313 12:52:26.627065 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/pull/0.log"
Mar 13 12:52:26 crc kubenswrapper[4837]: I0313 12:52:26.789466 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/extract/0.log"
Mar 13 12:52:26 crc kubenswrapper[4837]: I0313 12:52:26.800111 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-kbn8z_0a24601d-8e41-4f99-9e33-870d791a3e7e/manager/0.log"
Mar 13 12:52:27 crc kubenswrapper[4837]: I0313 12:52:27.250047 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-mrgb9_1870e3ae-40fd-479c-9aa7-9ce3a3e2dd2e/manager/0.log"
Mar 13 12:52:27 crc kubenswrapper[4837]: I0313 12:52:27.425076 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-ss4rm_b2c881d7-03db-4608-a3f4-9a9ad8b2f5da/manager/0.log"
Mar 13 12:52:27 crc kubenswrapper[4837]: I0313 12:52:27.600411 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-bvmr7_11a29883-0638-4da4-a1dc-bf2127a3645c/manager/0.log"
Mar 13 12:52:27 crc kubenswrapper[4837]: I0313 12:52:27.975202 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-9zvxf_89e6d6f8-7bd3-4862-b41c-cd5c1f05f3e5/manager/0.log"
Mar 13 12:52:28 crc kubenswrapper[4837]: I0313 12:52:28.258619 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-fhlk9_c19c3466-ab50-4be3-8299-d7b8b3d263df/manager/0.log"
Mar 13 12:52:28 crc kubenswrapper[4837]: I0313 12:52:28.331079 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-kc2x6_9bd066a9-3999-405a-b619-540678a46ded/manager/0.log"
Mar 13 12:52:28 crc kubenswrapper[4837]: I0313 12:52:28.447546 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-twrg7_fa1b1ba2-3856-49cb-bda4-8ac5e63b5298/manager/0.log"
Mar 13 12:52:28 crc kubenswrapper[4837]: I0313 12:52:28.769462 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-7nm95_046bdee0-f0cf-4d17-916b-68d301502473/manager/0.log"
Mar 13 12:52:29 crc kubenswrapper[4837]: E0313 12:52:29.026140 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ba6a258_c015_4c82_b7d0_736ea4ddf3a0.slice\": RecentStats: unable to find data in memory cache]"
Mar 13 12:52:29 crc kubenswrapper[4837]: I0313 12:52:29.507627 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-6ht9l_3059d7c0-2624-4d3e-af0f-de054401f1ec/manager/0.log"
Mar 13 12:52:29 crc kubenswrapper[4837]: I0313 12:52:29.522223 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-569cc54c5-shrx7_ee1c592d-7979-4b75-b8e4-7ccd6d7d6048/manager/0.log"
Mar 13 12:52:29 crc kubenswrapper[4837]: I0313 12:52:29.867825 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-7f7zd_561aed86-f289-4dd1-8c53-307ccdc99165/manager/0.log"
Mar 13 12:52:30 crc kubenswrapper[4837]: I0313 12:52:30.040783 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b77x9vc_7b38159c-e030-4734-963d-dfc38d29c75c/manager/0.log"
Mar 13 12:52:30 crc kubenswrapper[4837]: I0313 12:52:30.334062 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-c99df78b8-qxmfb_4f8c5e9e-7680-4bc3-8096-0c62a1de4da5/operator/0.log"
Mar 13 12:52:30 crc kubenswrapper[4837]: I0313 12:52:30.518908 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-mdjzs_9da10ec5-aa1b-4797-91ce-04a91266831a/registry-server/0.log"
Mar 13 12:52:31 crc kubenswrapper[4837]: I0313 12:52:31.174452 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-nxwr9_5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d/manager/0.log"
Mar 13 12:52:31 crc kubenswrapper[4837]: I0313 12:52:31.470358 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-fwblp_35a21ab1-95b5-446a-ae10-d004e5aa2995/manager/0.log"
Mar 13 12:52:31 crc kubenswrapper[4837]: I0313 12:52:31.559779 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-jvdqq_1d59bb7f-598d-4c70-9b8c-ce4e3048691f/manager/0.log"
Mar 13 12:52:31 crc kubenswrapper[4837]: I0313 12:52:31.606966 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xkk4z_ce0c89e1-3fc0-473d-875f-461c8b423061/operator/0.log"
Mar 13 12:52:31 crc kubenswrapper[4837]: I0313 12:52:31.815440 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-55876d85bb-96mp7_eaf3fa29-f441-43df-9fbe-409d9d8ad871/manager/0.log"
Mar 13 12:52:31 crc kubenswrapper[4837]: I0313 12:52:31.864877 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-8lkmx_cb20db22-bd0e-4897-8ed6-a6a80a91ffff/manager/0.log"
Mar 13 12:52:31 crc kubenswrapper[4837]: I0313 12:52:31.938919 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-cfv8z_55649f1c-678e-4e03-be55-7c4435446199/manager/0.log"
Mar 13 12:52:32 crc kubenswrapper[4837]: I0313 12:52:32.114901 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-dk4nr_fe107e39-b5ec-473d-8851-b57775dadafc/manager/0.log"
Mar 13 12:52:32 crc kubenswrapper[4837]: I0313 12:52:32.119871 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-hrcp9_5ef20b1d-5c03-4993-b635-b031ddcab3bf/manager/0.log"
Mar 13 12:52:33 crc kubenswrapper[4837]: I0313 12:52:33.116594 4837 scope.go:117] "RemoveContainer" containerID="ef54f8788c02611298b8e701f2aadf7ddd17abb3f2f5d777925238c78d7d9c68"
Mar 13 12:52:34 crc kubenswrapper[4837]: I0313 12:52:34.048515 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0"
Mar 13 12:52:34 crc kubenswrapper[4837]: E0313 12:52:34.049144 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8"
Mar 13 12:52:35 crc kubenswrapper[4837]: I0313 12:52:35.308891 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dklw6"
Mar 13 12:52:35 crc kubenswrapper[4837]: I0313 12:52:35.357148 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dklw6"
Mar 13 12:52:35 crc kubenswrapper[4837]: I0313 12:52:35.548810 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dklw6"]
Mar 13 12:52:36 crc kubenswrapper[4837]: I0313 12:52:36.785905 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dklw6" podUID="f6c415fc-c9b8-4e4f-9116-024c9bee94b0" containerName="registry-server" containerID="cri-o://2d7bff1e3c3855fc8cb2aa32ef5753d57469af94cb0b8d6de0d346b01f07fc87" gracePeriod=2
Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.311550 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dklw6"
Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.469151 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-catalog-content\") pod \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\" (UID: \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\") "
Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.469350 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-utilities\") pod \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\" (UID: \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\") "
Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.469390 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ts7w\" (UniqueName: \"kubernetes.io/projected/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-kube-api-access-5ts7w\") pod \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\" (UID: \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\") "
Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.470104 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-utilities" (OuterVolumeSpecName: "utilities") pod "f6c415fc-c9b8-4e4f-9116-024c9bee94b0" (UID: "f6c415fc-c9b8-4e4f-9116-024c9bee94b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.478133 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-kube-api-access-5ts7w" (OuterVolumeSpecName: "kube-api-access-5ts7w") pod "f6c415fc-c9b8-4e4f-9116-024c9bee94b0" (UID: "f6c415fc-c9b8-4e4f-9116-024c9bee94b0"). InnerVolumeSpecName "kube-api-access-5ts7w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.571831 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.572213 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ts7w\" (UniqueName: \"kubernetes.io/projected/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-kube-api-access-5ts7w\") on node \"crc\" DevicePath \"\""
Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.607778 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6c415fc-c9b8-4e4f-9116-024c9bee94b0" (UID: "f6c415fc-c9b8-4e4f-9116-024c9bee94b0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.674117 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.798369 4837 generic.go:334] "Generic (PLEG): container finished" podID="f6c415fc-c9b8-4e4f-9116-024c9bee94b0" containerID="2d7bff1e3c3855fc8cb2aa32ef5753d57469af94cb0b8d6de0d346b01f07fc87" exitCode=0
Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.798413 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dklw6" event={"ID":"f6c415fc-c9b8-4e4f-9116-024c9bee94b0","Type":"ContainerDied","Data":"2d7bff1e3c3855fc8cb2aa32ef5753d57469af94cb0b8d6de0d346b01f07fc87"}
Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.798443 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dklw6" event={"ID":"f6c415fc-c9b8-4e4f-9116-024c9bee94b0","Type":"ContainerDied","Data":"a5c56fb8f59524a48c89f88654723d62ee24c6c21452fed4f86ed021cc052e22"}
Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.798465 4837 scope.go:117] "RemoveContainer" containerID="2d7bff1e3c3855fc8cb2aa32ef5753d57469af94cb0b8d6de0d346b01f07fc87"
Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.798476 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dklw6"
Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.821396 4837 scope.go:117] "RemoveContainer" containerID="105c28a231210141f36873f32a835117fd67229caf2bcd61214068a7268ee516"
Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.840781 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dklw6"]
Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.856714 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dklw6"]
Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.871587 4837 scope.go:117] "RemoveContainer" containerID="466256fcf8dddb5197d5eca2dd860959fb51e832e51a92f91899b05aa8236f7b"
Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.899827 4837 scope.go:117] "RemoveContainer" containerID="2d7bff1e3c3855fc8cb2aa32ef5753d57469af94cb0b8d6de0d346b01f07fc87"
Mar 13 12:52:37 crc kubenswrapper[4837]: E0313 12:52:37.900262 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d7bff1e3c3855fc8cb2aa32ef5753d57469af94cb0b8d6de0d346b01f07fc87\": container with ID starting with 2d7bff1e3c3855fc8cb2aa32ef5753d57469af94cb0b8d6de0d346b01f07fc87 not found: ID does not exist" containerID="2d7bff1e3c3855fc8cb2aa32ef5753d57469af94cb0b8d6de0d346b01f07fc87"
Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.900355 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d7bff1e3c3855fc8cb2aa32ef5753d57469af94cb0b8d6de0d346b01f07fc87"} err="failed to get container status \"2d7bff1e3c3855fc8cb2aa32ef5753d57469af94cb0b8d6de0d346b01f07fc87\": rpc error: code = NotFound desc = could not find container \"2d7bff1e3c3855fc8cb2aa32ef5753d57469af94cb0b8d6de0d346b01f07fc87\": container with ID starting with 2d7bff1e3c3855fc8cb2aa32ef5753d57469af94cb0b8d6de0d346b01f07fc87 not found: ID does not exist"
Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.900387 4837 scope.go:117] "RemoveContainer" containerID="105c28a231210141f36873f32a835117fd67229caf2bcd61214068a7268ee516"
Mar 13 12:52:37 crc kubenswrapper[4837]: E0313 12:52:37.900696 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"105c28a231210141f36873f32a835117fd67229caf2bcd61214068a7268ee516\": container with ID starting with 105c28a231210141f36873f32a835117fd67229caf2bcd61214068a7268ee516 not found: ID does not exist" containerID="105c28a231210141f36873f32a835117fd67229caf2bcd61214068a7268ee516"
Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.900749 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"105c28a231210141f36873f32a835117fd67229caf2bcd61214068a7268ee516"} err="failed to get container status \"105c28a231210141f36873f32a835117fd67229caf2bcd61214068a7268ee516\": rpc error: code = NotFound desc = could not find container \"105c28a231210141f36873f32a835117fd67229caf2bcd61214068a7268ee516\": container with ID starting with 105c28a231210141f36873f32a835117fd67229caf2bcd61214068a7268ee516 not found: ID does not exist"
Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.900785 4837 scope.go:117] "RemoveContainer" containerID="466256fcf8dddb5197d5eca2dd860959fb51e832e51a92f91899b05aa8236f7b"
Mar 13 12:52:37 crc kubenswrapper[4837]: E0313 12:52:37.901058 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"466256fcf8dddb5197d5eca2dd860959fb51e832e51a92f91899b05aa8236f7b\": container with ID starting with 466256fcf8dddb5197d5eca2dd860959fb51e832e51a92f91899b05aa8236f7b not found: ID does not exist" containerID="466256fcf8dddb5197d5eca2dd860959fb51e832e51a92f91899b05aa8236f7b"
Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.901094 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"466256fcf8dddb5197d5eca2dd860959fb51e832e51a92f91899b05aa8236f7b"} err="failed to get container status \"466256fcf8dddb5197d5eca2dd860959fb51e832e51a92f91899b05aa8236f7b\": rpc error: code = NotFound desc = could not find container \"466256fcf8dddb5197d5eca2dd860959fb51e832e51a92f91899b05aa8236f7b\": container with ID starting with 466256fcf8dddb5197d5eca2dd860959fb51e832e51a92f91899b05aa8236f7b not found: ID does not exist"
Mar 13 12:52:39 crc kubenswrapper[4837]: I0313 12:52:39.063949 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6c415fc-c9b8-4e4f-9116-024c9bee94b0" path="/var/lib/kubelet/pods/f6c415fc-c9b8-4e4f-9116-024c9bee94b0/volumes"
Mar 13 12:52:39 crc kubenswrapper[4837]: E0313 12:52:39.255676 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ba6a258_c015_4c82_b7d0_736ea4ddf3a0.slice\": RecentStats: unable to find data in memory cache]"
Mar 13 12:52:40 crc kubenswrapper[4837]: I0313 12:52:40.958358 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wln6h"]
Mar 13 12:52:40 crc kubenswrapper[4837]: E0313 12:52:40.959141 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6c415fc-c9b8-4e4f-9116-024c9bee94b0" containerName="extract-utilities"
Mar 13 12:52:40 crc kubenswrapper[4837]: I0313 12:52:40.959159 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6c415fc-c9b8-4e4f-9116-024c9bee94b0" containerName="extract-utilities"
Mar 13 12:52:40 crc kubenswrapper[4837]: E0313 12:52:40.959190 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6c415fc-c9b8-4e4f-9116-024c9bee94b0" containerName="extract-content"
Mar 13 12:52:40 crc kubenswrapper[4837]: I0313 12:52:40.959198 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6c415fc-c9b8-4e4f-9116-024c9bee94b0" containerName="extract-content"
Mar 13 12:52:40 crc kubenswrapper[4837]: E0313 12:52:40.959210 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6c415fc-c9b8-4e4f-9116-024c9bee94b0" containerName="registry-server"
Mar 13 12:52:40 crc kubenswrapper[4837]: I0313 12:52:40.959218 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6c415fc-c9b8-4e4f-9116-024c9bee94b0" containerName="registry-server"
Mar 13 12:52:40 crc kubenswrapper[4837]: I0313 12:52:40.959478 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6c415fc-c9b8-4e4f-9116-024c9bee94b0" containerName="registry-server"
Mar 13 12:52:40 crc kubenswrapper[4837]: I0313 12:52:40.961190 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wln6h"
Mar 13 12:52:40 crc kubenswrapper[4837]: I0313 12:52:40.969297 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wln6h"]
Mar 13 12:52:41 crc kubenswrapper[4837]: I0313 12:52:41.143620 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33a9f660-ec35-4581-bf36-1daa67adf647-utilities\") pod \"redhat-marketplace-wln6h\" (UID: \"33a9f660-ec35-4581-bf36-1daa67adf647\") " pod="openshift-marketplace/redhat-marketplace-wln6h"
Mar 13 12:52:41 crc kubenswrapper[4837]: I0313 12:52:41.143734 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33a9f660-ec35-4581-bf36-1daa67adf647-catalog-content\") pod \"redhat-marketplace-wln6h\" (UID: \"33a9f660-ec35-4581-bf36-1daa67adf647\") " pod="openshift-marketplace/redhat-marketplace-wln6h"
Mar 13 12:52:41 crc kubenswrapper[4837]: I0313 12:52:41.143835 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8crp\" (UniqueName: \"kubernetes.io/projected/33a9f660-ec35-4581-bf36-1daa67adf647-kube-api-access-x8crp\") pod \"redhat-marketplace-wln6h\" (UID: \"33a9f660-ec35-4581-bf36-1daa67adf647\") " pod="openshift-marketplace/redhat-marketplace-wln6h"
Mar 13 12:52:41 crc kubenswrapper[4837]: I0313 12:52:41.245333 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33a9f660-ec35-4581-bf36-1daa67adf647-utilities\") pod \"redhat-marketplace-wln6h\" (UID: \"33a9f660-ec35-4581-bf36-1daa67adf647\") " pod="openshift-marketplace/redhat-marketplace-wln6h"
Mar 13 12:52:41 crc kubenswrapper[4837]: I0313 12:52:41.245377 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33a9f660-ec35-4581-bf36-1daa67adf647-catalog-content\") pod \"redhat-marketplace-wln6h\" (UID: \"33a9f660-ec35-4581-bf36-1daa67adf647\") " pod="openshift-marketplace/redhat-marketplace-wln6h"
Mar 13 12:52:41 crc kubenswrapper[4837]: I0313 12:52:41.245437 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8crp\" (UniqueName: \"kubernetes.io/projected/33a9f660-ec35-4581-bf36-1daa67adf647-kube-api-access-x8crp\") pod \"redhat-marketplace-wln6h\" (UID: \"33a9f660-ec35-4581-bf36-1daa67adf647\") " pod="openshift-marketplace/redhat-marketplace-wln6h"
Mar 13 12:52:41 crc kubenswrapper[4837]: I0313 12:52:41.246175 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33a9f660-ec35-4581-bf36-1daa67adf647-utilities\") pod \"redhat-marketplace-wln6h\" (UID: \"33a9f660-ec35-4581-bf36-1daa67adf647\") " pod="openshift-marketplace/redhat-marketplace-wln6h"
Mar 13 12:52:41 crc kubenswrapper[4837]: I0313 12:52:41.246395 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33a9f660-ec35-4581-bf36-1daa67adf647-catalog-content\") pod \"redhat-marketplace-wln6h\" (UID: \"33a9f660-ec35-4581-bf36-1daa67adf647\") " pod="openshift-marketplace/redhat-marketplace-wln6h"
Mar 13 12:52:41 crc kubenswrapper[4837]: I0313 12:52:41.273856 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8crp\" (UniqueName: \"kubernetes.io/projected/33a9f660-ec35-4581-bf36-1daa67adf647-kube-api-access-x8crp\") pod \"redhat-marketplace-wln6h\" (UID: \"33a9f660-ec35-4581-bf36-1daa67adf647\") " pod="openshift-marketplace/redhat-marketplace-wln6h"
Mar 13 12:52:41 crc kubenswrapper[4837]: I0313 12:52:41.279282 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wln6h"
Mar 13 12:52:41 crc kubenswrapper[4837]: I0313 12:52:41.796886 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wln6h"]
Mar 13 12:52:41 crc kubenswrapper[4837]: I0313 12:52:41.862907 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wln6h" event={"ID":"33a9f660-ec35-4581-bf36-1daa67adf647","Type":"ContainerStarted","Data":"78c851a3cb8a707bbc9d75331f4c8ac4e37dc37da153ff8366f930f1e8fb0aff"}
Mar 13 12:52:42 crc kubenswrapper[4837]: I0313 12:52:42.871834 4837 generic.go:334] "Generic (PLEG): container finished" podID="33a9f660-ec35-4581-bf36-1daa67adf647" containerID="3b4f16f40bd6bd84c7d2ec9b39478225f132d233a78e3209165694d1ece42356" exitCode=0
Mar 13 12:52:42 crc kubenswrapper[4837]: I0313 12:52:42.871917 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wln6h" event={"ID":"33a9f660-ec35-4581-bf36-1daa67adf647","Type":"ContainerDied","Data":"3b4f16f40bd6bd84c7d2ec9b39478225f132d233a78e3209165694d1ece42356"}
Mar 13 12:52:43 crc kubenswrapper[4837]: I0313 12:52:43.883359 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wln6h" event={"ID":"33a9f660-ec35-4581-bf36-1daa67adf647","Type":"ContainerStarted","Data":"de474c833342c79cab38b58fd8149189bb30432f5e3998f127c61412ecca9b36"}
Mar 13 12:52:44 crc kubenswrapper[4837]: I0313 12:52:44.892698 4837 generic.go:334] "Generic (PLEG): container finished" podID="33a9f660-ec35-4581-bf36-1daa67adf647" containerID="de474c833342c79cab38b58fd8149189bb30432f5e3998f127c61412ecca9b36" exitCode=0
Mar 13 12:52:44 crc kubenswrapper[4837]: I0313 12:52:44.892801 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wln6h" event={"ID":"33a9f660-ec35-4581-bf36-1daa67adf647","Type":"ContainerDied","Data":"de474c833342c79cab38b58fd8149189bb30432f5e3998f127c61412ecca9b36"}
Mar 13 12:52:45 crc kubenswrapper[4837]: I0313 12:52:45.904218 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wln6h" event={"ID":"33a9f660-ec35-4581-bf36-1daa67adf647","Type":"ContainerStarted","Data":"f8d10e2bc125b9421cfea61f6dc8b3970cac439f803759f0dcc1aff668088bea"}
Mar 13 12:52:45 crc kubenswrapper[4837]: I0313 12:52:45.929414 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wln6h" podStartSLOduration=3.439762606 podStartE2EDuration="5.929382887s" podCreationTimestamp="2026-03-13 12:52:40 +0000 UTC" firstStartedPulling="2026-03-13 12:52:42.875584458 +0000 UTC m=+3878.513851211" lastFinishedPulling="2026-03-13 12:52:45.365204719 +0000 UTC m=+3881.003471492" observedRunningTime="2026-03-13 12:52:45.926903589 +0000 UTC m=+3881.565170382" watchObservedRunningTime="2026-03-13 12:52:45.929382887 +0000 UTC m=+3881.567649640"
Mar 13 12:52:48 crc kubenswrapper[4837]: I0313 12:52:48.048697 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0"
Mar 13 12:52:48 crc kubenswrapper[4837]: E0313 12:52:48.049391 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8"
Mar 13 12:52:49 crc kubenswrapper[4837]: E0313 12:52:49.481422 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ba6a258_c015_4c82_b7d0_736ea4ddf3a0.slice\": RecentStats: unable to find data in memory cache]"
Mar 13 12:52:51 crc kubenswrapper[4837]: I0313 12:52:51.279920 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wln6h"
Mar 13 12:52:51 crc kubenswrapper[4837]: I0313 12:52:51.280448 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wln6h"
Mar 13 12:52:51 crc kubenswrapper[4837]: I0313 12:52:51.338205 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wln6h"
Mar 13 12:52:51 crc kubenswrapper[4837]: I0313 12:52:51.998269 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wln6h"
Mar 13 12:52:52 crc kubenswrapper[4837]: I0313 12:52:52.058175 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wln6h"]
Mar 13 12:52:52 crc kubenswrapper[4837]: I0313 12:52:52.567076 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-jrm5t_00848ba6-522a-45c7-81bd-7ab287d77626/control-plane-machine-set-operator/0.log"
Mar 13 12:52:52 crc kubenswrapper[4837]: I0313 12:52:52.688867 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vsp2m_6db10103-96be-4420-b302-a7064e347f61/kube-rbac-proxy/0.log"
Mar 13 12:52:52 crc kubenswrapper[4837]: I0313 12:52:52.789484 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vsp2m_6db10103-96be-4420-b302-a7064e347f61/machine-api-operator/0.log"
Mar 13 12:52:53 crc kubenswrapper[4837]: I0313 12:52:53.970002 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wln6h" podUID="33a9f660-ec35-4581-bf36-1daa67adf647" containerName="registry-server" containerID="cri-o://f8d10e2bc125b9421cfea61f6dc8b3970cac439f803759f0dcc1aff668088bea" gracePeriod=2
Mar 13 12:52:54 crc kubenswrapper[4837]: I0313 12:52:54.978773 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wln6h"
Mar 13 12:52:54 crc kubenswrapper[4837]: I0313 12:52:54.980225 4837 generic.go:334] "Generic (PLEG): container finished" podID="33a9f660-ec35-4581-bf36-1daa67adf647" containerID="f8d10e2bc125b9421cfea61f6dc8b3970cac439f803759f0dcc1aff668088bea" exitCode=0
Mar 13 12:52:54 crc kubenswrapper[4837]: I0313 12:52:54.980291 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wln6h" event={"ID":"33a9f660-ec35-4581-bf36-1daa67adf647","Type":"ContainerDied","Data":"f8d10e2bc125b9421cfea61f6dc8b3970cac439f803759f0dcc1aff668088bea"}
Mar 13 12:52:54 crc kubenswrapper[4837]: I0313 12:52:54.980370 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wln6h" event={"ID":"33a9f660-ec35-4581-bf36-1daa67adf647","Type":"ContainerDied","Data":"78c851a3cb8a707bbc9d75331f4c8ac4e37dc37da153ff8366f930f1e8fb0aff"}
Mar 13 12:52:54 crc kubenswrapper[4837]: I0313 12:52:54.980396 4837 scope.go:117] "RemoveContainer" containerID="f8d10e2bc125b9421cfea61f6dc8b3970cac439f803759f0dcc1aff668088bea"
Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.006904 4837 scope.go:117] "RemoveContainer" containerID="de474c833342c79cab38b58fd8149189bb30432f5e3998f127c61412ecca9b36"
Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.056405 4837 scope.go:117] "RemoveContainer" containerID="3b4f16f40bd6bd84c7d2ec9b39478225f132d233a78e3209165694d1ece42356"
Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.087737 4837 scope.go:117] "RemoveContainer" containerID="f8d10e2bc125b9421cfea61f6dc8b3970cac439f803759f0dcc1aff668088bea"
Mar 13 12:52:55 crc kubenswrapper[4837]: E0313 12:52:55.088179 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8d10e2bc125b9421cfea61f6dc8b3970cac439f803759f0dcc1aff668088bea\": container with ID starting with f8d10e2bc125b9421cfea61f6dc8b3970cac439f803759f0dcc1aff668088bea not found: ID does not exist" containerID="f8d10e2bc125b9421cfea61f6dc8b3970cac439f803759f0dcc1aff668088bea"
Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.088232 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8d10e2bc125b9421cfea61f6dc8b3970cac439f803759f0dcc1aff668088bea"} err="failed to get container status \"f8d10e2bc125b9421cfea61f6dc8b3970cac439f803759f0dcc1aff668088bea\": rpc error: code = NotFound desc = could not find container \"f8d10e2bc125b9421cfea61f6dc8b3970cac439f803759f0dcc1aff668088bea\": container with ID starting with f8d10e2bc125b9421cfea61f6dc8b3970cac439f803759f0dcc1aff668088bea not found: ID does not exist"
Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.088266 4837 scope.go:117] "RemoveContainer" containerID="de474c833342c79cab38b58fd8149189bb30432f5e3998f127c61412ecca9b36"
Mar 13 12:52:55 crc kubenswrapper[4837]: E0313 12:52:55.088704 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de474c833342c79cab38b58fd8149189bb30432f5e3998f127c61412ecca9b36\": container with ID starting with de474c833342c79cab38b58fd8149189bb30432f5e3998f127c61412ecca9b36 not found: ID does not exist" containerID="de474c833342c79cab38b58fd8149189bb30432f5e3998f127c61412ecca9b36"
Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.088772 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de474c833342c79cab38b58fd8149189bb30432f5e3998f127c61412ecca9b36"} err="failed to get container status \"de474c833342c79cab38b58fd8149189bb30432f5e3998f127c61412ecca9b36\": rpc error: code = NotFound desc = could not find container \"de474c833342c79cab38b58fd8149189bb30432f5e3998f127c61412ecca9b36\": container with ID starting with de474c833342c79cab38b58fd8149189bb30432f5e3998f127c61412ecca9b36 not found: ID does not exist"
Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.088807 4837 scope.go:117] "RemoveContainer" containerID="3b4f16f40bd6bd84c7d2ec9b39478225f132d233a78e3209165694d1ece42356"
Mar 13 12:52:55 crc kubenswrapper[4837]: E0313 12:52:55.089196 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b4f16f40bd6bd84c7d2ec9b39478225f132d233a78e3209165694d1ece42356\": container with ID starting with 3b4f16f40bd6bd84c7d2ec9b39478225f132d233a78e3209165694d1ece42356 not found: ID does not exist" containerID="3b4f16f40bd6bd84c7d2ec9b39478225f132d233a78e3209165694d1ece42356"
Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.089238 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b4f16f40bd6bd84c7d2ec9b39478225f132d233a78e3209165694d1ece42356"} err="failed to get container status \"3b4f16f40bd6bd84c7d2ec9b39478225f132d233a78e3209165694d1ece42356\": rpc error: code = NotFound desc = could not find container \"3b4f16f40bd6bd84c7d2ec9b39478225f132d233a78e3209165694d1ece42356\": container with ID starting with 3b4f16f40bd6bd84c7d2ec9b39478225f132d233a78e3209165694d1ece42356 not found: ID does not exist"
Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.125509 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8crp\" (UniqueName: \"kubernetes.io/projected/33a9f660-ec35-4581-bf36-1daa67adf647-kube-api-access-x8crp\") pod \"33a9f660-ec35-4581-bf36-1daa67adf647\" (UID: \"33a9f660-ec35-4581-bf36-1daa67adf647\") "
Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.125570 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33a9f660-ec35-4581-bf36-1daa67adf647-utilities\") pod \"33a9f660-ec35-4581-bf36-1daa67adf647\" (UID: \"33a9f660-ec35-4581-bf36-1daa67adf647\") "
Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.125837 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33a9f660-ec35-4581-bf36-1daa67adf647-catalog-content\") pod \"33a9f660-ec35-4581-bf36-1daa67adf647\" (UID: \"33a9f660-ec35-4581-bf36-1daa67adf647\") "
Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.126843 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33a9f660-ec35-4581-bf36-1daa67adf647-utilities" (OuterVolumeSpecName: "utilities") pod "33a9f660-ec35-4581-bf36-1daa67adf647" (UID: "33a9f660-ec35-4581-bf36-1daa67adf647"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.132354 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33a9f660-ec35-4581-bf36-1daa67adf647-kube-api-access-x8crp" (OuterVolumeSpecName: "kube-api-access-x8crp") pod "33a9f660-ec35-4581-bf36-1daa67adf647" (UID: "33a9f660-ec35-4581-bf36-1daa67adf647"). InnerVolumeSpecName "kube-api-access-x8crp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.162941 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33a9f660-ec35-4581-bf36-1daa67adf647-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33a9f660-ec35-4581-bf36-1daa67adf647" (UID: "33a9f660-ec35-4581-bf36-1daa67adf647"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.228210 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33a9f660-ec35-4581-bf36-1daa67adf647-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.228247 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8crp\" (UniqueName: \"kubernetes.io/projected/33a9f660-ec35-4581-bf36-1daa67adf647-kube-api-access-x8crp\") on node \"crc\" DevicePath \"\""
Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.228257 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33a9f660-ec35-4581-bf36-1daa67adf647-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.989934 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wln6h" Mar 13 12:52:56 crc kubenswrapper[4837]: I0313 12:52:56.024070 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wln6h"] Mar 13 12:52:56 crc kubenswrapper[4837]: I0313 12:52:56.039406 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wln6h"] Mar 13 12:52:57 crc kubenswrapper[4837]: I0313 12:52:57.058686 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33a9f660-ec35-4581-bf36-1daa67adf647" path="/var/lib/kubelet/pods/33a9f660-ec35-4581-bf36-1daa67adf647/volumes" Mar 13 12:52:59 crc kubenswrapper[4837]: E0313 12:52:59.691448 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ba6a258_c015_4c82_b7d0_736ea4ddf3a0.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:53:03 crc kubenswrapper[4837]: I0313 12:53:03.048415 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:53:03 crc kubenswrapper[4837]: E0313 12:53:03.049073 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:53:05 crc kubenswrapper[4837]: I0313 12:53:05.542957 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-dlspp_5ecc1237-3421-41d5-8efb-a62399ae1d73/cert-manager-controller/0.log" Mar 13 12:53:05 crc kubenswrapper[4837]: I0313 12:53:05.722668 4837 log.go:25] 
"Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-xzv5h_67507b8e-35d5-4dff-9239-45b5ef997e53/cert-manager-cainjector/0.log" Mar 13 12:53:05 crc kubenswrapper[4837]: I0313 12:53:05.791370 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-ht9vn_0e500b82-1f14-4a1e-937d-00248f195033/cert-manager-webhook/0.log" Mar 13 12:53:16 crc kubenswrapper[4837]: I0313 12:53:16.048879 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:53:16 crc kubenswrapper[4837]: E0313 12:53:16.049631 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:53:18 crc kubenswrapper[4837]: I0313 12:53:18.512453 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-fpxmr_00b31b3f-b520-493a-ad26-679e09376e81/nmstate-console-plugin/0.log" Mar 13 12:53:18 crc kubenswrapper[4837]: I0313 12:53:18.735996 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-vqqqz_ebe31727-805d-472e-89d3-e99b11435be1/nmstate-handler/0.log" Mar 13 12:53:18 crc kubenswrapper[4837]: I0313 12:53:18.769614 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-8xzdk_5d1f2d02-86ab-4679-a4e4-530ad37e4302/kube-rbac-proxy/0.log" Mar 13 12:53:18 crc kubenswrapper[4837]: I0313 12:53:18.899411 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-8xzdk_5d1f2d02-86ab-4679-a4e4-530ad37e4302/nmstate-metrics/0.log" Mar 13 12:53:18 crc kubenswrapper[4837]: I0313 12:53:18.950247 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-zf78q_ef7096b9-861a-4889-9318-535c35151777/nmstate-operator/0.log" Mar 13 12:53:19 crc kubenswrapper[4837]: I0313 12:53:19.112108 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-6cx5h_0b06c77a-f41d-41a6-b115-f12cc5109c0c/nmstate-webhook/0.log" Mar 13 12:53:28 crc kubenswrapper[4837]: I0313 12:53:28.048461 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:53:28 crc kubenswrapper[4837]: E0313 12:53:28.049179 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:53:41 crc kubenswrapper[4837]: I0313 12:53:41.048887 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:53:41 crc kubenswrapper[4837]: E0313 12:53:41.049721 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:53:44 crc kubenswrapper[4837]: I0313 
12:53:44.922475 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-zm9dj_0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88/kube-rbac-proxy/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.063096 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-zm9dj_0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88/controller/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.172508 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-frr-files/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.342707 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-frr-files/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.354345 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-reloader/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.396819 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-reloader/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.430055 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-metrics/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.550663 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-frr-files/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.601213 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-reloader/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.607196 4837 log.go:25] "Finished parsing log 
file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-metrics/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.650217 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-metrics/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.792106 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-reloader/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.807130 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-frr-files/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.825995 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/controller/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.843032 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-metrics/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.975820 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/frr-metrics/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.992689 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/kube-rbac-proxy/0.log" Mar 13 12:53:46 crc kubenswrapper[4837]: I0313 12:53:46.020288 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/kube-rbac-proxy-frr/0.log" Mar 13 12:53:46 crc kubenswrapper[4837]: I0313 12:53:46.169124 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/reloader/0.log" Mar 13 12:53:46 crc kubenswrapper[4837]: I0313 12:53:46.299319 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-jwgl7_c72405c5-2c81-43f4-93c6-f73f9771be8b/frr-k8s-webhook-server/0.log" Mar 13 12:53:46 crc kubenswrapper[4837]: I0313 12:53:46.434515 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-dcfbdf95f-7x96d_41898fd8-d078-444c-bb55-33f4fb6f3dcc/manager/0.log" Mar 13 12:53:46 crc kubenswrapper[4837]: I0313 12:53:46.630467 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-59b847b88-lrvzm_eabfad13-4fe4-495d-8b6a-2da56ef3b826/webhook-server/0.log" Mar 13 12:53:46 crc kubenswrapper[4837]: I0313 12:53:46.823842 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8skdh_82a5fe00-90be-47b1-a357-69942f385d4f/kube-rbac-proxy/0.log" Mar 13 12:53:47 crc kubenswrapper[4837]: I0313 12:53:47.441337 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8skdh_82a5fe00-90be-47b1-a357-69942f385d4f/speaker/0.log" Mar 13 12:53:47 crc kubenswrapper[4837]: I0313 12:53:47.916477 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/frr/0.log" Mar 13 12:53:52 crc kubenswrapper[4837]: I0313 12:53:52.049664 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:53:52 crc kubenswrapper[4837]: E0313 12:53:52.050363 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.152332 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556774-vbb4z"] Mar 13 12:54:00 crc kubenswrapper[4837]: E0313 12:54:00.153325 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33a9f660-ec35-4581-bf36-1daa67adf647" containerName="extract-content" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.153339 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="33a9f660-ec35-4581-bf36-1daa67adf647" containerName="extract-content" Mar 13 12:54:00 crc kubenswrapper[4837]: E0313 12:54:00.153351 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33a9f660-ec35-4581-bf36-1daa67adf647" containerName="registry-server" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.153357 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="33a9f660-ec35-4581-bf36-1daa67adf647" containerName="registry-server" Mar 13 12:54:00 crc kubenswrapper[4837]: E0313 12:54:00.153365 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33a9f660-ec35-4581-bf36-1daa67adf647" containerName="extract-utilities" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.153372 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="33a9f660-ec35-4581-bf36-1daa67adf647" containerName="extract-utilities" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.153594 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="33a9f660-ec35-4581-bf36-1daa67adf647" containerName="registry-server" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.154241 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556774-vbb4z" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.156415 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.157187 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.157464 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.163352 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556774-vbb4z"] Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.279815 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k49dn\" (UniqueName: \"kubernetes.io/projected/0473d8e9-f078-403a-a76b-c5bb02c0840d-kube-api-access-k49dn\") pod \"auto-csr-approver-29556774-vbb4z\" (UID: \"0473d8e9-f078-403a-a76b-c5bb02c0840d\") " pod="openshift-infra/auto-csr-approver-29556774-vbb4z" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.381960 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k49dn\" (UniqueName: \"kubernetes.io/projected/0473d8e9-f078-403a-a76b-c5bb02c0840d-kube-api-access-k49dn\") pod \"auto-csr-approver-29556774-vbb4z\" (UID: \"0473d8e9-f078-403a-a76b-c5bb02c0840d\") " pod="openshift-infra/auto-csr-approver-29556774-vbb4z" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.403834 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k49dn\" (UniqueName: \"kubernetes.io/projected/0473d8e9-f078-403a-a76b-c5bb02c0840d-kube-api-access-k49dn\") pod \"auto-csr-approver-29556774-vbb4z\" (UID: \"0473d8e9-f078-403a-a76b-c5bb02c0840d\") " 
pod="openshift-infra/auto-csr-approver-29556774-vbb4z" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.476006 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556774-vbb4z" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.972804 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556774-vbb4z"] Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.984329 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:54:01 crc kubenswrapper[4837]: I0313 12:54:01.542255 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556774-vbb4z" event={"ID":"0473d8e9-f078-403a-a76b-c5bb02c0840d","Type":"ContainerStarted","Data":"d84d2372cbdd47ff9d3dd65af9962256a6b22fb77f77936dea44b769d8318cb6"} Mar 13 12:54:01 crc kubenswrapper[4837]: I0313 12:54:01.578281 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/util/0.log" Mar 13 12:54:01 crc kubenswrapper[4837]: I0313 12:54:01.743330 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/pull/0.log" Mar 13 12:54:01 crc kubenswrapper[4837]: I0313 12:54:01.746954 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/util/0.log" Mar 13 12:54:01 crc kubenswrapper[4837]: I0313 12:54:01.790503 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/pull/0.log" Mar 13 12:54:01 crc 
kubenswrapper[4837]: I0313 12:54:01.971584 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/pull/0.log" Mar 13 12:54:01 crc kubenswrapper[4837]: I0313 12:54:01.987184 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/extract/0.log" Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.013820 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/util/0.log" Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.199100 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/util/0.log" Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.339213 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/util/0.log" Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.387379 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/pull/0.log" Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.390337 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/pull/0.log" Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.554384 4837 generic.go:334] "Generic (PLEG): container finished" podID="0473d8e9-f078-403a-a76b-c5bb02c0840d" 
containerID="122fd9d8a5ad9ec96047d911c5e084e75217bd7ee019902064f096162b6ade7b" exitCode=0 Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.554452 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556774-vbb4z" event={"ID":"0473d8e9-f078-403a-a76b-c5bb02c0840d","Type":"ContainerDied","Data":"122fd9d8a5ad9ec96047d911c5e084e75217bd7ee019902064f096162b6ade7b"} Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.556574 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/pull/0.log" Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.585526 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/util/0.log" Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.586038 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/extract/0.log" Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.748985 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/extract-utilities/0.log" Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.955598 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/extract-utilities/0.log" Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.973270 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/extract-content/0.log" Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.996380 4837 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/extract-content/0.log" Mar 13 12:54:03 crc kubenswrapper[4837]: I0313 12:54:03.186498 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/extract-content/0.log" Mar 13 12:54:03 crc kubenswrapper[4837]: I0313 12:54:03.186659 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/extract-utilities/0.log" Mar 13 12:54:03 crc kubenswrapper[4837]: I0313 12:54:03.399515 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/extract-utilities/0.log" Mar 13 12:54:03 crc kubenswrapper[4837]: I0313 12:54:03.570861 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/extract-utilities/0.log" Mar 13 12:54:03 crc kubenswrapper[4837]: I0313 12:54:03.601681 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/extract-content/0.log" Mar 13 12:54:03 crc kubenswrapper[4837]: I0313 12:54:03.612773 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/extract-content/0.log" Mar 13 12:54:03 crc kubenswrapper[4837]: I0313 12:54:03.753481 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/registry-server/0.log" Mar 13 12:54:03 crc kubenswrapper[4837]: I0313 12:54:03.841419 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/extract-content/0.log" Mar 13 12:54:03 crc kubenswrapper[4837]: I0313 12:54:03.854710 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/extract-utilities/0.log" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.060273 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556774-vbb4z" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.150368 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-7rzpc_b87c8f86-a346-4907-9441-048c3220646f/marketplace-operator/0.log" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.173561 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k49dn\" (UniqueName: \"kubernetes.io/projected/0473d8e9-f078-403a-a76b-c5bb02c0840d-kube-api-access-k49dn\") pod \"0473d8e9-f078-403a-a76b-c5bb02c0840d\" (UID: \"0473d8e9-f078-403a-a76b-c5bb02c0840d\") " Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.182895 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0473d8e9-f078-403a-a76b-c5bb02c0840d-kube-api-access-k49dn" (OuterVolumeSpecName: "kube-api-access-k49dn") pod "0473d8e9-f078-403a-a76b-c5bb02c0840d" (UID: "0473d8e9-f078-403a-a76b-c5bb02c0840d"). InnerVolumeSpecName "kube-api-access-k49dn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.275510 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k49dn\" (UniqueName: \"kubernetes.io/projected/0473d8e9-f078-403a-a76b-c5bb02c0840d-kube-api-access-k49dn\") on node \"crc\" DevicePath \"\"" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.286470 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/extract-utilities/0.log" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.435527 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/registry-server/0.log" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.439598 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/extract-content/0.log" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.501747 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/extract-utilities/0.log" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.567319 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/extract-content/0.log" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.571913 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556774-vbb4z" event={"ID":"0473d8e9-f078-403a-a76b-c5bb02c0840d","Type":"ContainerDied","Data":"d84d2372cbdd47ff9d3dd65af9962256a6b22fb77f77936dea44b769d8318cb6"} Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.571960 4837 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="d84d2372cbdd47ff9d3dd65af9962256a6b22fb77f77936dea44b769d8318cb6" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.572892 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556774-vbb4z" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.693996 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/extract-utilities/0.log" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.740058 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/extract-content/0.log" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.820943 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/registry-server/0.log" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.962899 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/extract-utilities/0.log" Mar 13 12:54:05 crc kubenswrapper[4837]: I0313 12:54:05.057341 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:54:05 crc kubenswrapper[4837]: E0313 12:54:05.057586 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:54:05 crc kubenswrapper[4837]: I0313 12:54:05.135212 4837 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-infra/auto-csr-approver-29556768-f66sk"]
Mar 13 12:54:05 crc kubenswrapper[4837]: I0313 12:54:05.136166 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/extract-utilities/0.log"
Mar 13 12:54:05 crc kubenswrapper[4837]: I0313 12:54:05.146822 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556768-f66sk"]
Mar 13 12:54:05 crc kubenswrapper[4837]: I0313 12:54:05.169548 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/extract-content/0.log"
Mar 13 12:54:05 crc kubenswrapper[4837]: I0313 12:54:05.197912 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/extract-content/0.log"
Mar 13 12:54:05 crc kubenswrapper[4837]: I0313 12:54:05.369599 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/extract-utilities/0.log"
Mar 13 12:54:05 crc kubenswrapper[4837]: I0313 12:54:05.384411 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/extract-content/0.log"
Mar 13 12:54:05 crc kubenswrapper[4837]: I0313 12:54:05.886744 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/registry-server/0.log"
Mar 13 12:54:07 crc kubenswrapper[4837]: I0313 12:54:07.089213 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b297ac1-71ba-4b15-b915-a38f9da4ebb7" path="/var/lib/kubelet/pods/1b297ac1-71ba-4b15-b915-a38f9da4ebb7/volumes"
Mar 13 12:54:20 crc kubenswrapper[4837]: I0313 12:54:20.048817 4837 scope.go:117] "RemoveContainer"
containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0"
Mar 13 12:54:20 crc kubenswrapper[4837]: I0313 12:54:20.728910 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"29354e15c5b23a19853789916f8ed484d322f379033f92e55fb369856d2b8dbb"}
Mar 13 12:54:33 crc kubenswrapper[4837]: I0313 12:54:33.245021 4837 scope.go:117] "RemoveContainer" containerID="6da52e600ecb49afa497ca1fed54ebec9623af66e73a4cbe5e0c9804569c398b"
Mar 13 12:54:37 crc kubenswrapper[4837]: E0313 12:54:37.140818 4837 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.138:35258->38.102.83.138:43005: write tcp 38.102.83.138:35258->38.102.83.138:43005: write: broken pipe
Mar 13 12:55:45 crc kubenswrapper[4837]: I0313 12:55:45.871738 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rhlvr"]
Mar 13 12:55:45 crc kubenswrapper[4837]: E0313 12:55:45.873033 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0473d8e9-f078-403a-a76b-c5bb02c0840d" containerName="oc"
Mar 13 12:55:45 crc kubenswrapper[4837]: I0313 12:55:45.873052 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0473d8e9-f078-403a-a76b-c5bb02c0840d" containerName="oc"
Mar 13 12:55:45 crc kubenswrapper[4837]: I0313 12:55:45.873311 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0473d8e9-f078-403a-a76b-c5bb02c0840d" containerName="oc"
Mar 13 12:55:45 crc kubenswrapper[4837]: I0313 12:55:45.874992 4837 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-rhlvr"
Mar 13 12:55:45 crc kubenswrapper[4837]: I0313 12:55:45.904165 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rhlvr"]
Mar 13 12:55:45 crc kubenswrapper[4837]: I0313 12:55:45.990280 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b9fb61f-c188-4889-a632-b3e0e4807ced-catalog-content\") pod \"community-operators-rhlvr\" (UID: \"1b9fb61f-c188-4889-a632-b3e0e4807ced\") " pod="openshift-marketplace/community-operators-rhlvr"
Mar 13 12:55:45 crc kubenswrapper[4837]: I0313 12:55:45.990356 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b9fb61f-c188-4889-a632-b3e0e4807ced-utilities\") pod \"community-operators-rhlvr\" (UID: \"1b9fb61f-c188-4889-a632-b3e0e4807ced\") " pod="openshift-marketplace/community-operators-rhlvr"
Mar 13 12:55:45 crc kubenswrapper[4837]: I0313 12:55:45.990692 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgp98\" (UniqueName: \"kubernetes.io/projected/1b9fb61f-c188-4889-a632-b3e0e4807ced-kube-api-access-lgp98\") pod \"community-operators-rhlvr\" (UID: \"1b9fb61f-c188-4889-a632-b3e0e4807ced\") " pod="openshift-marketplace/community-operators-rhlvr"
Mar 13 12:55:46 crc kubenswrapper[4837]: I0313 12:55:46.092538 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgp98\" (UniqueName: \"kubernetes.io/projected/1b9fb61f-c188-4889-a632-b3e0e4807ced-kube-api-access-lgp98\") pod \"community-operators-rhlvr\" (UID: \"1b9fb61f-c188-4889-a632-b3e0e4807ced\") " pod="openshift-marketplace/community-operators-rhlvr"
Mar 13 12:55:46 crc kubenswrapper[4837]: I0313 12:55:46.092675 4837
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b9fb61f-c188-4889-a632-b3e0e4807ced-catalog-content\") pod \"community-operators-rhlvr\" (UID: \"1b9fb61f-c188-4889-a632-b3e0e4807ced\") " pod="openshift-marketplace/community-operators-rhlvr"
Mar 13 12:55:46 crc kubenswrapper[4837]: I0313 12:55:46.092721 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b9fb61f-c188-4889-a632-b3e0e4807ced-utilities\") pod \"community-operators-rhlvr\" (UID: \"1b9fb61f-c188-4889-a632-b3e0e4807ced\") " pod="openshift-marketplace/community-operators-rhlvr"
Mar 13 12:55:46 crc kubenswrapper[4837]: I0313 12:55:46.093272 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b9fb61f-c188-4889-a632-b3e0e4807ced-utilities\") pod \"community-operators-rhlvr\" (UID: \"1b9fb61f-c188-4889-a632-b3e0e4807ced\") " pod="openshift-marketplace/community-operators-rhlvr"
Mar 13 12:55:46 crc kubenswrapper[4837]: I0313 12:55:46.093409 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b9fb61f-c188-4889-a632-b3e0e4807ced-catalog-content\") pod \"community-operators-rhlvr\" (UID: \"1b9fb61f-c188-4889-a632-b3e0e4807ced\") " pod="openshift-marketplace/community-operators-rhlvr"
Mar 13 12:55:46 crc kubenswrapper[4837]: I0313 12:55:46.622498 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgp98\" (UniqueName: \"kubernetes.io/projected/1b9fb61f-c188-4889-a632-b3e0e4807ced-kube-api-access-lgp98\") pod \"community-operators-rhlvr\" (UID: \"1b9fb61f-c188-4889-a632-b3e0e4807ced\") " pod="openshift-marketplace/community-operators-rhlvr"
Mar 13 12:55:46 crc kubenswrapper[4837]: I0313 12:55:46.802278 4837 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-rhlvr"
Mar 13 12:55:47 crc kubenswrapper[4837]: I0313 12:55:47.273953 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rhlvr"]
Mar 13 12:55:47 crc kubenswrapper[4837]: I0313 12:55:47.641415 4837 generic.go:334] "Generic (PLEG): container finished" podID="1b9fb61f-c188-4889-a632-b3e0e4807ced" containerID="b7bf1cee88f3eaca26285ce714fdc8c7b26a974cc8bd89f746a3db6e78eba902" exitCode=0
Mar 13 12:55:47 crc kubenswrapper[4837]: I0313 12:55:47.641463 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhlvr" event={"ID":"1b9fb61f-c188-4889-a632-b3e0e4807ced","Type":"ContainerDied","Data":"b7bf1cee88f3eaca26285ce714fdc8c7b26a974cc8bd89f746a3db6e78eba902"}
Mar 13 12:55:47 crc kubenswrapper[4837]: I0313 12:55:47.641491 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhlvr" event={"ID":"1b9fb61f-c188-4889-a632-b3e0e4807ced","Type":"ContainerStarted","Data":"76392729e1c1144daa565ff3166e05881bf3e6a854cfffe6c7fa94d800f4c807"}
Mar 13 12:55:48 crc kubenswrapper[4837]: I0313 12:55:48.651027 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhlvr" event={"ID":"1b9fb61f-c188-4889-a632-b3e0e4807ced","Type":"ContainerStarted","Data":"40ba339fa75a80c8cfd85992b52dd9d9f36065f16925c616d3a71c9c95bd8fd1"}
Mar 13 12:55:50 crc kubenswrapper[4837]: I0313 12:55:50.676469 4837 generic.go:334] "Generic (PLEG): container finished" podID="1b9fb61f-c188-4889-a632-b3e0e4807ced" containerID="40ba339fa75a80c8cfd85992b52dd9d9f36065f16925c616d3a71c9c95bd8fd1" exitCode=0
Mar 13 12:55:50 crc kubenswrapper[4837]: I0313 12:55:50.676554 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhlvr"
event={"ID":"1b9fb61f-c188-4889-a632-b3e0e4807ced","Type":"ContainerDied","Data":"40ba339fa75a80c8cfd85992b52dd9d9f36065f16925c616d3a71c9c95bd8fd1"}
Mar 13 12:55:51 crc kubenswrapper[4837]: I0313 12:55:51.688793 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhlvr" event={"ID":"1b9fb61f-c188-4889-a632-b3e0e4807ced","Type":"ContainerStarted","Data":"fe95202103b15f3830e412f24d36b5902a1bf208f03066384d4ee35a8f90949b"}
Mar 13 12:55:51 crc kubenswrapper[4837]: I0313 12:55:51.710231 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rhlvr" podStartSLOduration=3.249867477 podStartE2EDuration="6.710203664s" podCreationTimestamp="2026-03-13 12:55:45 +0000 UTC" firstStartedPulling="2026-03-13 12:55:47.64371904 +0000 UTC m=+4063.281985813" lastFinishedPulling="2026-03-13 12:55:51.104055237 +0000 UTC m=+4066.742322000" observedRunningTime="2026-03-13 12:55:51.704623059 +0000 UTC m=+4067.342889822" watchObservedRunningTime="2026-03-13 12:55:51.710203664 +0000 UTC m=+4067.348470427"
Mar 13 12:55:56 crc kubenswrapper[4837]: I0313 12:55:56.803616 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rhlvr"
Mar 13 12:55:56 crc kubenswrapper[4837]: I0313 12:55:56.804145 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rhlvr"
Mar 13 12:55:56 crc kubenswrapper[4837]: I0313 12:55:56.851577 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rhlvr"
Mar 13 12:55:57 crc kubenswrapper[4837]: I0313 12:55:57.761214 4837 generic.go:334] "Generic (PLEG): container finished" podID="130c1c0e-31b1-415d-aab2-fab358576a73" containerID="433a139fea2255c45e8580415a3deca8258493b41f46198b67c0eac345fb5a75" exitCode=0
Mar 13 12:55:57 crc kubenswrapper[4837]: I0313 12:55:57.761417
4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lkqv5/must-gather-vz7zz" event={"ID":"130c1c0e-31b1-415d-aab2-fab358576a73","Type":"ContainerDied","Data":"433a139fea2255c45e8580415a3deca8258493b41f46198b67c0eac345fb5a75"}
Mar 13 12:55:57 crc kubenswrapper[4837]: I0313 12:55:57.763376 4837 scope.go:117] "RemoveContainer" containerID="433a139fea2255c45e8580415a3deca8258493b41f46198b67c0eac345fb5a75"
Mar 13 12:55:57 crc kubenswrapper[4837]: I0313 12:55:57.824369 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rhlvr"
Mar 13 12:55:57 crc kubenswrapper[4837]: I0313 12:55:57.871945 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rhlvr"]
Mar 13 12:55:58 crc kubenswrapper[4837]: I0313 12:55:58.414684 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lkqv5_must-gather-vz7zz_130c1c0e-31b1-415d-aab2-fab358576a73/gather/0.log"
Mar 13 12:55:59 crc kubenswrapper[4837]: I0313 12:55:59.778460 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rhlvr" podUID="1b9fb61f-c188-4889-a632-b3e0e4807ced" containerName="registry-server" containerID="cri-o://fe95202103b15f3830e412f24d36b5902a1bf208f03066384d4ee35a8f90949b" gracePeriod=2
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.146198 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556776-t6fxv"]
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.151117 4837 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556776-t6fxv"
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.155057 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.155217 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj"
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.155408 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.157219 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556776-t6fxv"]
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.157761 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwg5x\" (UniqueName: \"kubernetes.io/projected/d64a81f2-3643-4e57-8322-c09c0360d46b-kube-api-access-jwg5x\") pod \"auto-csr-approver-29556776-t6fxv\" (UID: \"d64a81f2-3643-4e57-8322-c09c0360d46b\") " pod="openshift-infra/auto-csr-approver-29556776-t6fxv"
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.259059 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwg5x\" (UniqueName: \"kubernetes.io/projected/d64a81f2-3643-4e57-8322-c09c0360d46b-kube-api-access-jwg5x\") pod \"auto-csr-approver-29556776-t6fxv\" (UID: \"d64a81f2-3643-4e57-8322-c09c0360d46b\") " pod="openshift-infra/auto-csr-approver-29556776-t6fxv"
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.270126 4837 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-rhlvr"
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.286312 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwg5x\" (UniqueName: \"kubernetes.io/projected/d64a81f2-3643-4e57-8322-c09c0360d46b-kube-api-access-jwg5x\") pod \"auto-csr-approver-29556776-t6fxv\" (UID: \"d64a81f2-3643-4e57-8322-c09c0360d46b\") " pod="openshift-infra/auto-csr-approver-29556776-t6fxv"
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.364991 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b9fb61f-c188-4889-a632-b3e0e4807ced-catalog-content\") pod \"1b9fb61f-c188-4889-a632-b3e0e4807ced\" (UID: \"1b9fb61f-c188-4889-a632-b3e0e4807ced\") "
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.365097 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgp98\" (UniqueName: \"kubernetes.io/projected/1b9fb61f-c188-4889-a632-b3e0e4807ced-kube-api-access-lgp98\") pod \"1b9fb61f-c188-4889-a632-b3e0e4807ced\" (UID: \"1b9fb61f-c188-4889-a632-b3e0e4807ced\") "
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.365186 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b9fb61f-c188-4889-a632-b3e0e4807ced-utilities\") pod \"1b9fb61f-c188-4889-a632-b3e0e4807ced\" (UID: \"1b9fb61f-c188-4889-a632-b3e0e4807ced\") "
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.366571 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b9fb61f-c188-4889-a632-b3e0e4807ced-utilities" (OuterVolumeSpecName: "utilities") pod "1b9fb61f-c188-4889-a632-b3e0e4807ced" (UID: "1b9fb61f-c188-4889-a632-b3e0e4807ced"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.370010 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b9fb61f-c188-4889-a632-b3e0e4807ced-kube-api-access-lgp98" (OuterVolumeSpecName: "kube-api-access-lgp98") pod "1b9fb61f-c188-4889-a632-b3e0e4807ced" (UID: "1b9fb61f-c188-4889-a632-b3e0e4807ced"). InnerVolumeSpecName "kube-api-access-lgp98". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.447749 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b9fb61f-c188-4889-a632-b3e0e4807ced-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b9fb61f-c188-4889-a632-b3e0e4807ced" (UID: "1b9fb61f-c188-4889-a632-b3e0e4807ced"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.468092 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b9fb61f-c188-4889-a632-b3e0e4807ced-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.468131 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgp98\" (UniqueName: \"kubernetes.io/projected/1b9fb61f-c188-4889-a632-b3e0e4807ced-kube-api-access-lgp98\") on node \"crc\" DevicePath \"\""
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.468149 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b9fb61f-c188-4889-a632-b3e0e4807ced-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.559688 4837 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556776-t6fxv"
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.795742 4837 generic.go:334] "Generic (PLEG): container finished" podID="1b9fb61f-c188-4889-a632-b3e0e4807ced" containerID="fe95202103b15f3830e412f24d36b5902a1bf208f03066384d4ee35a8f90949b" exitCode=0
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.795813 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhlvr" event={"ID":"1b9fb61f-c188-4889-a632-b3e0e4807ced","Type":"ContainerDied","Data":"fe95202103b15f3830e412f24d36b5902a1bf208f03066384d4ee35a8f90949b"}
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.795854 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhlvr" event={"ID":"1b9fb61f-c188-4889-a632-b3e0e4807ced","Type":"ContainerDied","Data":"76392729e1c1144daa565ff3166e05881bf3e6a854cfffe6c7fa94d800f4c807"}
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.795876 4837 scope.go:117] "RemoveContainer" containerID="fe95202103b15f3830e412f24d36b5902a1bf208f03066384d4ee35a8f90949b"
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.796052 4837 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-rhlvr"
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.837781 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rhlvr"]
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.845152 4837 scope.go:117] "RemoveContainer" containerID="40ba339fa75a80c8cfd85992b52dd9d9f36065f16925c616d3a71c9c95bd8fd1"
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.848504 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rhlvr"]
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.865606 4837 scope.go:117] "RemoveContainer" containerID="b7bf1cee88f3eaca26285ce714fdc8c7b26a974cc8bd89f746a3db6e78eba902"
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.882957 4837 scope.go:117] "RemoveContainer" containerID="fe95202103b15f3830e412f24d36b5902a1bf208f03066384d4ee35a8f90949b"
Mar 13 12:56:00 crc kubenswrapper[4837]: E0313 12:56:00.887913 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe95202103b15f3830e412f24d36b5902a1bf208f03066384d4ee35a8f90949b\": container with ID starting with fe95202103b15f3830e412f24d36b5902a1bf208f03066384d4ee35a8f90949b not found: ID does not exist" containerID="fe95202103b15f3830e412f24d36b5902a1bf208f03066384d4ee35a8f90949b"
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.887973 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe95202103b15f3830e412f24d36b5902a1bf208f03066384d4ee35a8f90949b"} err="failed to get container status \"fe95202103b15f3830e412f24d36b5902a1bf208f03066384d4ee35a8f90949b\": rpc error: code = NotFound desc = could not find container \"fe95202103b15f3830e412f24d36b5902a1bf208f03066384d4ee35a8f90949b\": container with ID starting with fe95202103b15f3830e412f24d36b5902a1bf208f03066384d4ee35a8f90949b not
found: ID does not exist"
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.888250 4837 scope.go:117] "RemoveContainer" containerID="40ba339fa75a80c8cfd85992b52dd9d9f36065f16925c616d3a71c9c95bd8fd1"
Mar 13 12:56:00 crc kubenswrapper[4837]: E0313 12:56:00.888695 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40ba339fa75a80c8cfd85992b52dd9d9f36065f16925c616d3a71c9c95bd8fd1\": container with ID starting with 40ba339fa75a80c8cfd85992b52dd9d9f36065f16925c616d3a71c9c95bd8fd1 not found: ID does not exist" containerID="40ba339fa75a80c8cfd85992b52dd9d9f36065f16925c616d3a71c9c95bd8fd1"
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.888719 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40ba339fa75a80c8cfd85992b52dd9d9f36065f16925c616d3a71c9c95bd8fd1"} err="failed to get container status \"40ba339fa75a80c8cfd85992b52dd9d9f36065f16925c616d3a71c9c95bd8fd1\": rpc error: code = NotFound desc = could not find container \"40ba339fa75a80c8cfd85992b52dd9d9f36065f16925c616d3a71c9c95bd8fd1\": container with ID starting with 40ba339fa75a80c8cfd85992b52dd9d9f36065f16925c616d3a71c9c95bd8fd1 not found: ID does not exist"
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.888735 4837 scope.go:117] "RemoveContainer" containerID="b7bf1cee88f3eaca26285ce714fdc8c7b26a974cc8bd89f746a3db6e78eba902"
Mar 13 12:56:00 crc kubenswrapper[4837]: E0313 12:56:00.889020 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7bf1cee88f3eaca26285ce714fdc8c7b26a974cc8bd89f746a3db6e78eba902\": container with ID starting with b7bf1cee88f3eaca26285ce714fdc8c7b26a974cc8bd89f746a3db6e78eba902 not found: ID does not exist" containerID="b7bf1cee88f3eaca26285ce714fdc8c7b26a974cc8bd89f746a3db6e78eba902"
Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.889066 4837
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7bf1cee88f3eaca26285ce714fdc8c7b26a974cc8bd89f746a3db6e78eba902"} err="failed to get container status \"b7bf1cee88f3eaca26285ce714fdc8c7b26a974cc8bd89f746a3db6e78eba902\": rpc error: code = NotFound desc = could not find container \"b7bf1cee88f3eaca26285ce714fdc8c7b26a974cc8bd89f746a3db6e78eba902\": container with ID starting with b7bf1cee88f3eaca26285ce714fdc8c7b26a974cc8bd89f746a3db6e78eba902 not found: ID does not exist"
Mar 13 12:56:01 crc kubenswrapper[4837]: I0313 12:56:01.012023 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556776-t6fxv"]
Mar 13 12:56:01 crc kubenswrapper[4837]: I0313 12:56:01.060885 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b9fb61f-c188-4889-a632-b3e0e4807ced" path="/var/lib/kubelet/pods/1b9fb61f-c188-4889-a632-b3e0e4807ced/volumes"
Mar 13 12:56:01 crc kubenswrapper[4837]: I0313 12:56:01.806967 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556776-t6fxv" event={"ID":"d64a81f2-3643-4e57-8322-c09c0360d46b","Type":"ContainerStarted","Data":"d96b73b2fd795c760721a9677f17c0f047b16fbbb6085f7694336b1ec8d78177"}
Mar 13 12:56:02 crc kubenswrapper[4837]: I0313 12:56:02.820446 4837 generic.go:334] "Generic (PLEG): container finished" podID="d64a81f2-3643-4e57-8322-c09c0360d46b" containerID="1086c80585379365c2bff27c51b687e139e8eb2c034f632dcdf9de8104b8d107" exitCode=0
Mar 13 12:56:02 crc kubenswrapper[4837]: I0313 12:56:02.820587 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556776-t6fxv" event={"ID":"d64a81f2-3643-4e57-8322-c09c0360d46b","Type":"ContainerDied","Data":"1086c80585379365c2bff27c51b687e139e8eb2c034f632dcdf9de8104b8d107"}
Mar 13 12:56:04 crc kubenswrapper[4837]: I0313 12:56:04.168847 4837 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556776-t6fxv"
Mar 13 12:56:04 crc kubenswrapper[4837]: I0313 12:56:04.341652 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwg5x\" (UniqueName: \"kubernetes.io/projected/d64a81f2-3643-4e57-8322-c09c0360d46b-kube-api-access-jwg5x\") pod \"d64a81f2-3643-4e57-8322-c09c0360d46b\" (UID: \"d64a81f2-3643-4e57-8322-c09c0360d46b\") "
Mar 13 12:56:04 crc kubenswrapper[4837]: I0313 12:56:04.349072 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d64a81f2-3643-4e57-8322-c09c0360d46b-kube-api-access-jwg5x" (OuterVolumeSpecName: "kube-api-access-jwg5x") pod "d64a81f2-3643-4e57-8322-c09c0360d46b" (UID: "d64a81f2-3643-4e57-8322-c09c0360d46b"). InnerVolumeSpecName "kube-api-access-jwg5x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:56:04 crc kubenswrapper[4837]: I0313 12:56:04.444144 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwg5x\" (UniqueName: \"kubernetes.io/projected/d64a81f2-3643-4e57-8322-c09c0360d46b-kube-api-access-jwg5x\") on node \"crc\" DevicePath \"\""
Mar 13 12:56:04 crc kubenswrapper[4837]: I0313 12:56:04.837344 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556776-t6fxv" event={"ID":"d64a81f2-3643-4e57-8322-c09c0360d46b","Type":"ContainerDied","Data":"d96b73b2fd795c760721a9677f17c0f047b16fbbb6085f7694336b1ec8d78177"}
Mar 13 12:56:04 crc kubenswrapper[4837]: I0313 12:56:04.837613 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d96b73b2fd795c760721a9677f17c0f047b16fbbb6085f7694336b1ec8d78177"
Mar 13 12:56:04 crc kubenswrapper[4837]: I0313 12:56:04.837410 4837 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556776-t6fxv"
Mar 13 12:56:05 crc kubenswrapper[4837]: I0313 12:56:05.229944 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556770-5vjcn"]
Mar 13 12:56:05 crc kubenswrapper[4837]: I0313 12:56:05.241321 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556770-5vjcn"]
Mar 13 12:56:07 crc kubenswrapper[4837]: I0313 12:56:07.066791 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39e3042e-9415-4734-bfa5-8def0b858b6e" path="/var/lib/kubelet/pods/39e3042e-9415-4734-bfa5-8def0b858b6e/volumes"
Mar 13 12:56:10 crc kubenswrapper[4837]: I0313 12:56:10.250384 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lkqv5/must-gather-vz7zz"]
Mar 13 12:56:10 crc kubenswrapper[4837]: I0313 12:56:10.251283 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-lkqv5/must-gather-vz7zz" podUID="130c1c0e-31b1-415d-aab2-fab358576a73" containerName="copy" containerID="cri-o://bc010a3c2a92443b50c947cd27f9323f2921ea8aae80c058217be8b624f5d427" gracePeriod=2
Mar 13 12:56:10 crc kubenswrapper[4837]: I0313 12:56:10.263938 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lkqv5/must-gather-vz7zz"]
Mar 13 12:56:10 crc kubenswrapper[4837]: I0313 12:56:10.887833 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lkqv5_must-gather-vz7zz_130c1c0e-31b1-415d-aab2-fab358576a73/copy/0.log"
Mar 13 12:56:10 crc kubenswrapper[4837]: I0313 12:56:10.888188 4837 generic.go:334] "Generic (PLEG): container finished" podID="130c1c0e-31b1-415d-aab2-fab358576a73" containerID="bc010a3c2a92443b50c947cd27f9323f2921ea8aae80c058217be8b624f5d427" exitCode=143
Mar 13 12:56:11 crc kubenswrapper[4837]: I0313 12:56:11.169826 4837 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-must-gather-lkqv5_must-gather-vz7zz_130c1c0e-31b1-415d-aab2-fab358576a73/copy/0.log"
Mar 13 12:56:11 crc kubenswrapper[4837]: I0313 12:56:11.170515 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lkqv5/must-gather-vz7zz"
Mar 13 12:56:11 crc kubenswrapper[4837]: I0313 12:56:11.203661 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/130c1c0e-31b1-415d-aab2-fab358576a73-must-gather-output\") pod \"130c1c0e-31b1-415d-aab2-fab358576a73\" (UID: \"130c1c0e-31b1-415d-aab2-fab358576a73\") "
Mar 13 12:56:11 crc kubenswrapper[4837]: I0313 12:56:11.203730 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzd72\" (UniqueName: \"kubernetes.io/projected/130c1c0e-31b1-415d-aab2-fab358576a73-kube-api-access-tzd72\") pod \"130c1c0e-31b1-415d-aab2-fab358576a73\" (UID: \"130c1c0e-31b1-415d-aab2-fab358576a73\") "
Mar 13 12:56:11 crc kubenswrapper[4837]: I0313 12:56:11.209064 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/130c1c0e-31b1-415d-aab2-fab358576a73-kube-api-access-tzd72" (OuterVolumeSpecName: "kube-api-access-tzd72") pod "130c1c0e-31b1-415d-aab2-fab358576a73" (UID: "130c1c0e-31b1-415d-aab2-fab358576a73"). InnerVolumeSpecName "kube-api-access-tzd72".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 12:56:11 crc kubenswrapper[4837]: I0313 12:56:11.305958 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzd72\" (UniqueName: \"kubernetes.io/projected/130c1c0e-31b1-415d-aab2-fab358576a73-kube-api-access-tzd72\") on node \"crc\" DevicePath \"\""
Mar 13 12:56:11 crc kubenswrapper[4837]: I0313 12:56:11.371987 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/130c1c0e-31b1-415d-aab2-fab358576a73-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "130c1c0e-31b1-415d-aab2-fab358576a73" (UID: "130c1c0e-31b1-415d-aab2-fab358576a73"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 12:56:11 crc kubenswrapper[4837]: I0313 12:56:11.408285 4837 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/130c1c0e-31b1-415d-aab2-fab358576a73-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 13 12:56:11 crc kubenswrapper[4837]: I0313 12:56:11.897399 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lkqv5_must-gather-vz7zz_130c1c0e-31b1-415d-aab2-fab358576a73/copy/0.log"
Mar 13 12:56:11 crc kubenswrapper[4837]: I0313 12:56:11.897919 4837 scope.go:117] "RemoveContainer" containerID="bc010a3c2a92443b50c947cd27f9323f2921ea8aae80c058217be8b624f5d427"
Mar 13 12:56:11 crc kubenswrapper[4837]: I0313 12:56:11.897938 4837 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-lkqv5/must-gather-vz7zz" Mar 13 12:56:11 crc kubenswrapper[4837]: I0313 12:56:11.921507 4837 scope.go:117] "RemoveContainer" containerID="433a139fea2255c45e8580415a3deca8258493b41f46198b67c0eac345fb5a75" Mar 13 12:56:13 crc kubenswrapper[4837]: I0313 12:56:13.059945 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="130c1c0e-31b1-415d-aab2-fab358576a73" path="/var/lib/kubelet/pods/130c1c0e-31b1-415d-aab2-fab358576a73/volumes" Mar 13 12:56:33 crc kubenswrapper[4837]: I0313 12:56:33.341550 4837 scope.go:117] "RemoveContainer" containerID="8a03a622bd1e0141b38071e7ff2bc9ecddb0162408970736756d5805f18fdf44" Mar 13 12:56:35 crc kubenswrapper[4837]: I0313 12:56:35.483409 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:56:35 crc kubenswrapper[4837]: I0313 12:56:35.483905 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:57:05 crc kubenswrapper[4837]: I0313 12:57:05.483965 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:57:05 crc kubenswrapper[4837]: I0313 12:57:05.484380 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" 
podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:57:35 crc kubenswrapper[4837]: I0313 12:57:35.484215 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:57:35 crc kubenswrapper[4837]: I0313 12:57:35.484767 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:57:35 crc kubenswrapper[4837]: I0313 12:57:35.484814 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 12:57:35 crc kubenswrapper[4837]: I0313 12:57:35.485485 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"29354e15c5b23a19853789916f8ed484d322f379033f92e55fb369856d2b8dbb"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:57:35 crc kubenswrapper[4837]: I0313 12:57:35.485531 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://29354e15c5b23a19853789916f8ed484d322f379033f92e55fb369856d2b8dbb" gracePeriod=600 Mar 13 
12:57:35 crc kubenswrapper[4837]: I0313 12:57:35.855973 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="29354e15c5b23a19853789916f8ed484d322f379033f92e55fb369856d2b8dbb" exitCode=0 Mar 13 12:57:35 crc kubenswrapper[4837]: I0313 12:57:35.856020 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"29354e15c5b23a19853789916f8ed484d322f379033f92e55fb369856d2b8dbb"} Mar 13 12:57:35 crc kubenswrapper[4837]: I0313 12:57:35.856240 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"d587321d51a1203fc11d365d3e6abfaf3ed8b51e170ae3c59f3a432ec954d9de"} Mar 13 12:57:35 crc kubenswrapper[4837]: I0313 12:57:35.856260 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.140282 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556778-pbsh7"] Mar 13 12:58:00 crc kubenswrapper[4837]: E0313 12:58:00.142662 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b9fb61f-c188-4889-a632-b3e0e4807ced" containerName="registry-server" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.142786 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b9fb61f-c188-4889-a632-b3e0e4807ced" containerName="registry-server" Mar 13 12:58:00 crc kubenswrapper[4837]: E0313 12:58:00.142873 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b9fb61f-c188-4889-a632-b3e0e4807ced" containerName="extract-utilities" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.142951 4837 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1b9fb61f-c188-4889-a632-b3e0e4807ced" containerName="extract-utilities" Mar 13 12:58:00 crc kubenswrapper[4837]: E0313 12:58:00.143043 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="130c1c0e-31b1-415d-aab2-fab358576a73" containerName="gather" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.143146 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="130c1c0e-31b1-415d-aab2-fab358576a73" containerName="gather" Mar 13 12:58:00 crc kubenswrapper[4837]: E0313 12:58:00.143232 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b9fb61f-c188-4889-a632-b3e0e4807ced" containerName="extract-content" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.143314 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b9fb61f-c188-4889-a632-b3e0e4807ced" containerName="extract-content" Mar 13 12:58:00 crc kubenswrapper[4837]: E0313 12:58:00.143405 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d64a81f2-3643-4e57-8322-c09c0360d46b" containerName="oc" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.143484 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="d64a81f2-3643-4e57-8322-c09c0360d46b" containerName="oc" Mar 13 12:58:00 crc kubenswrapper[4837]: E0313 12:58:00.143582 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="130c1c0e-31b1-415d-aab2-fab358576a73" containerName="copy" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.143675 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="130c1c0e-31b1-415d-aab2-fab358576a73" containerName="copy" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.143977 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b9fb61f-c188-4889-a632-b3e0e4807ced" containerName="registry-server" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.144085 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="130c1c0e-31b1-415d-aab2-fab358576a73" containerName="copy" 
Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.144177 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="d64a81f2-3643-4e57-8322-c09c0360d46b" containerName="oc" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.144259 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="130c1c0e-31b1-415d-aab2-fab358576a73" containerName="gather" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.145086 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556778-pbsh7" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.147371 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.148585 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.148610 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.152552 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556778-pbsh7"] Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.292821 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv4qx\" (UniqueName: \"kubernetes.io/projected/3f053d99-932e-4b5e-812a-6f58a6580bef-kube-api-access-qv4qx\") pod \"auto-csr-approver-29556778-pbsh7\" (UID: \"3f053d99-932e-4b5e-812a-6f58a6580bef\") " pod="openshift-infra/auto-csr-approver-29556778-pbsh7" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.394198 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv4qx\" (UniqueName: \"kubernetes.io/projected/3f053d99-932e-4b5e-812a-6f58a6580bef-kube-api-access-qv4qx\") pod 
\"auto-csr-approver-29556778-pbsh7\" (UID: \"3f053d99-932e-4b5e-812a-6f58a6580bef\") " pod="openshift-infra/auto-csr-approver-29556778-pbsh7" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.413585 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv4qx\" (UniqueName: \"kubernetes.io/projected/3f053d99-932e-4b5e-812a-6f58a6580bef-kube-api-access-qv4qx\") pod \"auto-csr-approver-29556778-pbsh7\" (UID: \"3f053d99-932e-4b5e-812a-6f58a6580bef\") " pod="openshift-infra/auto-csr-approver-29556778-pbsh7" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.462419 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556778-pbsh7" Mar 13 12:58:00 crc kubenswrapper[4837]: W0313 12:58:00.905758 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f053d99_932e_4b5e_812a_6f58a6580bef.slice/crio-a7aee66ea0867e20081ad9cc8dd0c99b97ad869ab35b6a16cece45cd867b13aa WatchSource:0}: Error finding container a7aee66ea0867e20081ad9cc8dd0c99b97ad869ab35b6a16cece45cd867b13aa: Status 404 returned error can't find the container with id a7aee66ea0867e20081ad9cc8dd0c99b97ad869ab35b6a16cece45cd867b13aa Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.915258 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556778-pbsh7"] Mar 13 12:58:01 crc kubenswrapper[4837]: I0313 12:58:01.091528 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556778-pbsh7" event={"ID":"3f053d99-932e-4b5e-812a-6f58a6580bef","Type":"ContainerStarted","Data":"a7aee66ea0867e20081ad9cc8dd0c99b97ad869ab35b6a16cece45cd867b13aa"} Mar 13 12:58:02 crc kubenswrapper[4837]: I0313 12:58:02.100905 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556778-pbsh7" 
event={"ID":"3f053d99-932e-4b5e-812a-6f58a6580bef","Type":"ContainerStarted","Data":"7f19f392263b0041f3e39b1838209aac48dd556dd18ed9e53e8ba99d4a96b20e"} Mar 13 12:58:02 crc kubenswrapper[4837]: I0313 12:58:02.122996 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556778-pbsh7" podStartSLOduration=1.321761548 podStartE2EDuration="2.122977652s" podCreationTimestamp="2026-03-13 12:58:00 +0000 UTC" firstStartedPulling="2026-03-13 12:58:00.907841349 +0000 UTC m=+4196.546108112" lastFinishedPulling="2026-03-13 12:58:01.709057453 +0000 UTC m=+4197.347324216" observedRunningTime="2026-03-13 12:58:02.112575925 +0000 UTC m=+4197.750842688" watchObservedRunningTime="2026-03-13 12:58:02.122977652 +0000 UTC m=+4197.761244415" Mar 13 12:58:03 crc kubenswrapper[4837]: I0313 12:58:03.110447 4837 generic.go:334] "Generic (PLEG): container finished" podID="3f053d99-932e-4b5e-812a-6f58a6580bef" containerID="7f19f392263b0041f3e39b1838209aac48dd556dd18ed9e53e8ba99d4a96b20e" exitCode=0 Mar 13 12:58:03 crc kubenswrapper[4837]: I0313 12:58:03.110501 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556778-pbsh7" event={"ID":"3f053d99-932e-4b5e-812a-6f58a6580bef","Type":"ContainerDied","Data":"7f19f392263b0041f3e39b1838209aac48dd556dd18ed9e53e8ba99d4a96b20e"} Mar 13 12:58:04 crc kubenswrapper[4837]: I0313 12:58:04.483260 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556778-pbsh7" Mar 13 12:58:04 crc kubenswrapper[4837]: I0313 12:58:04.675873 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv4qx\" (UniqueName: \"kubernetes.io/projected/3f053d99-932e-4b5e-812a-6f58a6580bef-kube-api-access-qv4qx\") pod \"3f053d99-932e-4b5e-812a-6f58a6580bef\" (UID: \"3f053d99-932e-4b5e-812a-6f58a6580bef\") " Mar 13 12:58:04 crc kubenswrapper[4837]: I0313 12:58:04.681245 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f053d99-932e-4b5e-812a-6f58a6580bef-kube-api-access-qv4qx" (OuterVolumeSpecName: "kube-api-access-qv4qx") pod "3f053d99-932e-4b5e-812a-6f58a6580bef" (UID: "3f053d99-932e-4b5e-812a-6f58a6580bef"). InnerVolumeSpecName "kube-api-access-qv4qx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:58:04 crc kubenswrapper[4837]: I0313 12:58:04.778560 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv4qx\" (UniqueName: \"kubernetes.io/projected/3f053d99-932e-4b5e-812a-6f58a6580bef-kube-api-access-qv4qx\") on node \"crc\" DevicePath \"\"" Mar 13 12:58:05 crc kubenswrapper[4837]: I0313 12:58:05.134070 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556778-pbsh7" event={"ID":"3f053d99-932e-4b5e-812a-6f58a6580bef","Type":"ContainerDied","Data":"a7aee66ea0867e20081ad9cc8dd0c99b97ad869ab35b6a16cece45cd867b13aa"} Mar 13 12:58:05 crc kubenswrapper[4837]: I0313 12:58:05.134120 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7aee66ea0867e20081ad9cc8dd0c99b97ad869ab35b6a16cece45cd867b13aa" Mar 13 12:58:05 crc kubenswrapper[4837]: I0313 12:58:05.134163 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556778-pbsh7" Mar 13 12:58:05 crc kubenswrapper[4837]: I0313 12:58:05.188830 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556772-5k6fp"] Mar 13 12:58:05 crc kubenswrapper[4837]: I0313 12:58:05.197170 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556772-5k6fp"] Mar 13 12:58:07 crc kubenswrapper[4837]: I0313 12:58:07.058768 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ba6a258-c015-4c82-b7d0-736ea4ddf3a0" path="/var/lib/kubelet/pods/1ba6a258-c015-4c82-b7d0-736ea4ddf3a0/volumes" Mar 13 12:58:33 crc kubenswrapper[4837]: I0313 12:58:33.482504 4837 scope.go:117] "RemoveContainer" containerID="202f4741378dc74444f14dd2386ad8db9f6a085bd9e8216a4ebc85b491ab3c81" Mar 13 12:59:35 crc kubenswrapper[4837]: I0313 12:59:35.484771 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:59:35 crc kubenswrapper[4837]: I0313 12:59:35.486235 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.148262 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556780-bhggf"] Mar 13 13:00:00 crc kubenswrapper[4837]: E0313 13:00:00.149290 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f053d99-932e-4b5e-812a-6f58a6580bef" containerName="oc" Mar 13 13:00:00 crc 
kubenswrapper[4837]: I0313 13:00:00.149305 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f053d99-932e-4b5e-812a-6f58a6580bef" containerName="oc" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.149503 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f053d99-932e-4b5e-812a-6f58a6580bef" containerName="oc" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.150081 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556780-bhggf" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.156594 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.157293 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.157292 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.159499 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww"] Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.160861 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.162262 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.162698 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.171218 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww"] Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.182230 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556780-bhggf"] Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.243142 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa971b27-9617-4542-83d2-0f99d06a6d7a-config-volume\") pod \"collect-profiles-29556780-6ffww\" (UID: \"fa971b27-9617-4542-83d2-0f99d06a6d7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.243208 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa971b27-9617-4542-83d2-0f99d06a6d7a-secret-volume\") pod \"collect-profiles-29556780-6ffww\" (UID: \"fa971b27-9617-4542-83d2-0f99d06a6d7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.243513 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5nnn\" (UniqueName: 
\"kubernetes.io/projected/fa971b27-9617-4542-83d2-0f99d06a6d7a-kube-api-access-j5nnn\") pod \"collect-profiles-29556780-6ffww\" (UID: \"fa971b27-9617-4542-83d2-0f99d06a6d7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.243618 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7nsj\" (UniqueName: \"kubernetes.io/projected/293685bd-214f-4596-863a-1e9ecee9d95b-kube-api-access-z7nsj\") pod \"auto-csr-approver-29556780-bhggf\" (UID: \"293685bd-214f-4596-863a-1e9ecee9d95b\") " pod="openshift-infra/auto-csr-approver-29556780-bhggf" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.345767 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa971b27-9617-4542-83d2-0f99d06a6d7a-config-volume\") pod \"collect-profiles-29556780-6ffww\" (UID: \"fa971b27-9617-4542-83d2-0f99d06a6d7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.345875 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa971b27-9617-4542-83d2-0f99d06a6d7a-secret-volume\") pod \"collect-profiles-29556780-6ffww\" (UID: \"fa971b27-9617-4542-83d2-0f99d06a6d7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.346051 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5nnn\" (UniqueName: \"kubernetes.io/projected/fa971b27-9617-4542-83d2-0f99d06a6d7a-kube-api-access-j5nnn\") pod \"collect-profiles-29556780-6ffww\" (UID: \"fa971b27-9617-4542-83d2-0f99d06a6d7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" Mar 13 13:00:00 crc 
kubenswrapper[4837]: I0313 13:00:00.346122 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7nsj\" (UniqueName: \"kubernetes.io/projected/293685bd-214f-4596-863a-1e9ecee9d95b-kube-api-access-z7nsj\") pod \"auto-csr-approver-29556780-bhggf\" (UID: \"293685bd-214f-4596-863a-1e9ecee9d95b\") " pod="openshift-infra/auto-csr-approver-29556780-bhggf" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.346859 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa971b27-9617-4542-83d2-0f99d06a6d7a-config-volume\") pod \"collect-profiles-29556780-6ffww\" (UID: \"fa971b27-9617-4542-83d2-0f99d06a6d7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.352159 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa971b27-9617-4542-83d2-0f99d06a6d7a-secret-volume\") pod \"collect-profiles-29556780-6ffww\" (UID: \"fa971b27-9617-4542-83d2-0f99d06a6d7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.366097 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7nsj\" (UniqueName: \"kubernetes.io/projected/293685bd-214f-4596-863a-1e9ecee9d95b-kube-api-access-z7nsj\") pod \"auto-csr-approver-29556780-bhggf\" (UID: \"293685bd-214f-4596-863a-1e9ecee9d95b\") " pod="openshift-infra/auto-csr-approver-29556780-bhggf" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.366881 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5nnn\" (UniqueName: \"kubernetes.io/projected/fa971b27-9617-4542-83d2-0f99d06a6d7a-kube-api-access-j5nnn\") pod \"collect-profiles-29556780-6ffww\" (UID: \"fa971b27-9617-4542-83d2-0f99d06a6d7a\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.481530 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556780-bhggf" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.496681 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.941001 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww"] Mar 13 13:00:00 crc kubenswrapper[4837]: W0313 13:00:00.951513 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod293685bd_214f_4596_863a_1e9ecee9d95b.slice/crio-ebbeba1290baa1b891587325e104a68b547b63d168b34d3f0777b61e45c84747 WatchSource:0}: Error finding container ebbeba1290baa1b891587325e104a68b547b63d168b34d3f0777b61e45c84747: Status 404 returned error can't find the container with id ebbeba1290baa1b891587325e104a68b547b63d168b34d3f0777b61e45c84747 Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.959620 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.964652 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556780-bhggf"] Mar 13 13:00:01 crc kubenswrapper[4837]: I0313 13:00:01.228591 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" event={"ID":"fa971b27-9617-4542-83d2-0f99d06a6d7a","Type":"ContainerStarted","Data":"8d7cdc4c7b18315fcb8174acd2c4d4ae189b674f4f7257f266a090ea7f6dfe22"} Mar 13 13:00:01 crc kubenswrapper[4837]: I0313 13:00:01.228713 4837 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" event={"ID":"fa971b27-9617-4542-83d2-0f99d06a6d7a","Type":"ContainerStarted","Data":"76dab8b6b65f6b1976c605d530d9fb847ddfd01e344a115de3fff71ccf82fb1e"} Mar 13 13:00:01 crc kubenswrapper[4837]: I0313 13:00:01.230602 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556780-bhggf" event={"ID":"293685bd-214f-4596-863a-1e9ecee9d95b","Type":"ContainerStarted","Data":"ebbeba1290baa1b891587325e104a68b547b63d168b34d3f0777b61e45c84747"} Mar 13 13:00:01 crc kubenswrapper[4837]: I0313 13:00:01.243930 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" podStartSLOduration=1.243914389 podStartE2EDuration="1.243914389s" podCreationTimestamp="2026-03-13 13:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:00:01.241941067 +0000 UTC m=+4316.880207830" watchObservedRunningTime="2026-03-13 13:00:01.243914389 +0000 UTC m=+4316.882181152" Mar 13 13:00:02 crc kubenswrapper[4837]: I0313 13:00:02.240913 4837 generic.go:334] "Generic (PLEG): container finished" podID="fa971b27-9617-4542-83d2-0f99d06a6d7a" containerID="8d7cdc4c7b18315fcb8174acd2c4d4ae189b674f4f7257f266a090ea7f6dfe22" exitCode=0 Mar 13 13:00:02 crc kubenswrapper[4837]: I0313 13:00:02.240982 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" event={"ID":"fa971b27-9617-4542-83d2-0f99d06a6d7a","Type":"ContainerDied","Data":"8d7cdc4c7b18315fcb8174acd2c4d4ae189b674f4f7257f266a090ea7f6dfe22"} Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.019146 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.112968 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa971b27-9617-4542-83d2-0f99d06a6d7a-config-volume\") pod \"fa971b27-9617-4542-83d2-0f99d06a6d7a\" (UID: \"fa971b27-9617-4542-83d2-0f99d06a6d7a\") " Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.113152 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5nnn\" (UniqueName: \"kubernetes.io/projected/fa971b27-9617-4542-83d2-0f99d06a6d7a-kube-api-access-j5nnn\") pod \"fa971b27-9617-4542-83d2-0f99d06a6d7a\" (UID: \"fa971b27-9617-4542-83d2-0f99d06a6d7a\") " Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.113182 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa971b27-9617-4542-83d2-0f99d06a6d7a-secret-volume\") pod \"fa971b27-9617-4542-83d2-0f99d06a6d7a\" (UID: \"fa971b27-9617-4542-83d2-0f99d06a6d7a\") " Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.113896 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa971b27-9617-4542-83d2-0f99d06a6d7a-config-volume" (OuterVolumeSpecName: "config-volume") pod "fa971b27-9617-4542-83d2-0f99d06a6d7a" (UID: "fa971b27-9617-4542-83d2-0f99d06a6d7a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.115132 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa971b27-9617-4542-83d2-0f99d06a6d7a-config-volume\") on node \"crc\" DevicePath \"\""
Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.119097 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa971b27-9617-4542-83d2-0f99d06a6d7a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fa971b27-9617-4542-83d2-0f99d06a6d7a" (UID: "fa971b27-9617-4542-83d2-0f99d06a6d7a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.119552 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa971b27-9617-4542-83d2-0f99d06a6d7a-kube-api-access-j5nnn" (OuterVolumeSpecName: "kube-api-access-j5nnn") pod "fa971b27-9617-4542-83d2-0f99d06a6d7a" (UID: "fa971b27-9617-4542-83d2-0f99d06a6d7a"). InnerVolumeSpecName "kube-api-access-j5nnn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.217159 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5nnn\" (UniqueName: \"kubernetes.io/projected/fa971b27-9617-4542-83d2-0f99d06a6d7a-kube-api-access-j5nnn\") on node \"crc\" DevicePath \"\""
Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.217196 4837 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa971b27-9617-4542-83d2-0f99d06a6d7a-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.257938 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" event={"ID":"fa971b27-9617-4542-83d2-0f99d06a6d7a","Type":"ContainerDied","Data":"76dab8b6b65f6b1976c605d530d9fb847ddfd01e344a115de3fff71ccf82fb1e"}
Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.257980 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76dab8b6b65f6b1976c605d530d9fb847ddfd01e344a115de3fff71ccf82fb1e"
Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.258061 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww"
Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.314883 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8"]
Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.324885 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8"]
Mar 13 13:00:05 crc kubenswrapper[4837]: I0313 13:00:05.063222 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c6ce131-8677-48bc-8f07-b53837bd751b" path="/var/lib/kubelet/pods/3c6ce131-8677-48bc-8f07-b53837bd751b/volumes"
Mar 13 13:00:05 crc kubenswrapper[4837]: I0313 13:00:05.272748 4837 generic.go:334] "Generic (PLEG): container finished" podID="293685bd-214f-4596-863a-1e9ecee9d95b" containerID="66ef54191b2db003069fe2c0851e73076dafa7515d7eec4f4c56684a523cab10" exitCode=0
Mar 13 13:00:05 crc kubenswrapper[4837]: I0313 13:00:05.272808 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556780-bhggf" event={"ID":"293685bd-214f-4596-863a-1e9ecee9d95b","Type":"ContainerDied","Data":"66ef54191b2db003069fe2c0851e73076dafa7515d7eec4f4c56684a523cab10"}
Mar 13 13:00:05 crc kubenswrapper[4837]: I0313 13:00:05.484484 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 13:00:05 crc kubenswrapper[4837]: I0313 13:00:05.485021 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 13:00:06 crc kubenswrapper[4837]: I0313 13:00:06.599872 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556780-bhggf"
Mar 13 13:00:06 crc kubenswrapper[4837]: I0313 13:00:06.664001 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7nsj\" (UniqueName: \"kubernetes.io/projected/293685bd-214f-4596-863a-1e9ecee9d95b-kube-api-access-z7nsj\") pod \"293685bd-214f-4596-863a-1e9ecee9d95b\" (UID: \"293685bd-214f-4596-863a-1e9ecee9d95b\") "
Mar 13 13:00:06 crc kubenswrapper[4837]: I0313 13:00:06.669562 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/293685bd-214f-4596-863a-1e9ecee9d95b-kube-api-access-z7nsj" (OuterVolumeSpecName: "kube-api-access-z7nsj") pod "293685bd-214f-4596-863a-1e9ecee9d95b" (UID: "293685bd-214f-4596-863a-1e9ecee9d95b"). InnerVolumeSpecName "kube-api-access-z7nsj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:00:06 crc kubenswrapper[4837]: I0313 13:00:06.765870 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7nsj\" (UniqueName: \"kubernetes.io/projected/293685bd-214f-4596-863a-1e9ecee9d95b-kube-api-access-z7nsj\") on node \"crc\" DevicePath \"\""
Mar 13 13:00:07 crc kubenswrapper[4837]: I0313 13:00:07.292915 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556780-bhggf" event={"ID":"293685bd-214f-4596-863a-1e9ecee9d95b","Type":"ContainerDied","Data":"ebbeba1290baa1b891587325e104a68b547b63d168b34d3f0777b61e45c84747"}
Mar 13 13:00:07 crc kubenswrapper[4837]: I0313 13:00:07.292950 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556780-bhggf"
Mar 13 13:00:07 crc kubenswrapper[4837]: I0313 13:00:07.292962 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebbeba1290baa1b891587325e104a68b547b63d168b34d3f0777b61e45c84747"
Mar 13 13:00:07 crc kubenswrapper[4837]: I0313 13:00:07.666223 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556774-vbb4z"]
Mar 13 13:00:07 crc kubenswrapper[4837]: I0313 13:00:07.674901 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556774-vbb4z"]
Mar 13 13:00:09 crc kubenswrapper[4837]: I0313 13:00:09.069378 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0473d8e9-f078-403a-a76b-c5bb02c0840d" path="/var/lib/kubelet/pods/0473d8e9-f078-403a-a76b-c5bb02c0840d/volumes"
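The entries above follow the common klog header layout wrapped in a syslog prefix (`Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.115132 4837 file.go:293] message`). A minimal sketch of splitting such a line into fields with Python's `re` — the pattern is an assumption inferred from the lines in this log, not an official klog grammar:

```python
import re

# Syslog prefix (month, day, time, host, process[pid]:) followed by the
# klog header: level letter, MMDD timestamp, pid, source file:line, "] ", message.
KLOG_RE = re.compile(
    r"^(?P<month>\w{3}) (?P<day>\d+) (?P<time>[\d:]+) (?P<host>\S+) "
    r"(?P<proc>\S+?)\[(?P<pid>\d+)\]: "
    r"(?P<level>[IWEF])(?P<klog_ts>\d{4} [\d:.]+) (?P<klog_pid>\d+) "
    r"(?P<source>[\w./-]+:\d+)\] (?P<msg>.*)$"
)

def parse_kubelet_line(line: str):
    """Return a dict of named fields for one kubelet log line, or None if it doesn't match."""
    m = KLOG_RE.match(line)
    return m.groupdict() if m else None

# One of the entries from this log, abbreviated:
example = ('Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.217159 4837 '
           'reconciler_common.go:293] "Volume detached for volume ..."')
fields = parse_kubelet_line(example)
```

Grouping the parsed entries by `source` or by the pod name embedded in `msg` is a quick way to isolate, say, the repeated "No ready sandbox" messages from the volume-teardown traffic.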